Apr 17 18:49:16.118077 ip-10-0-141-118 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 18:49:16.536629 ip-10-0-141-118 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:16.536629 ip-10-0-141-118 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 18:49:16.536629 ip-10-0-141-118 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:16.536629 ip-10-0-141-118 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 18:49:16.536629 ip-10-0-141-118 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:49:16.540526 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.540378 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 18:49:16.542881 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542860 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:16.542881 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542876 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:16.542881 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542882 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:16.542881 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542887 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542891 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542895 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542899 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542903 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542907 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542911 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542914 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542918 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542921 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542925 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542929 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542940 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542945 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542949 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542953 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542956 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542960 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542965 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542968 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:16.543134 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542972 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542976 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542979 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542987 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542993 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.542998 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543002 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543007 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543012 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543016 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543021 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543025 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543029 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543033 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543039 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543043 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543047 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543051 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543055 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543059 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:16.543994 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543063 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543068 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543072 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543076 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543080 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543084 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543089 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543092 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543096 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543100 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543104 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543108 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543112 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543116 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543121 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543125 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543129 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543133 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543137 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:16.544654 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543140 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543145 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543149 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543155 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543158 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543163 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543167 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543176 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543183 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543188 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543193 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543197 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543201 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543205 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543209 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543214 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543218 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543222 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543226 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543231 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:16.545212 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543234 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543239 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543243 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543247 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543907 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543916 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543921 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543926 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543931 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543935 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543939 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543943 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543947 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543952 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543956 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543960 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543964 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543968 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543972 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543977 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:16.545901 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543982 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543986 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543990 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543994 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.543998 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544002 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544006 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544010 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544016 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544022 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544027 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544032 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544037 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544041 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544045 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544049 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544053 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544057 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544061 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544065 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:16.546809 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544069 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544073 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544078 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544081 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544086 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544090 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544094 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544098 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544102 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544106 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544110 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544115 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544121 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544125 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544129 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544133 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544137 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544141 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544145 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544149 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:16.547375 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544154 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544158 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544164 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544170 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544175 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544179 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544186 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544191 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544196 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544201 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544206 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544210 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544214 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544219 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544223 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544227 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544231 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544235 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544239 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544243 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:16.547975 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544247 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544251 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544255 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544260 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544264 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544268 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544272 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544276 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544280 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.544284 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545485 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545503 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545513 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545521 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545528 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545534 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545541 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545549 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545555 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545560 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545566 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545572 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 18:49:16.548699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545577 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545582 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545587 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545591 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545596 2574 flags.go:64] FLAG: --cloud-config=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545601 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545605 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545613 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545618 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545623 2574 flags.go:64] FLAG: --config-dir=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545628 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545633 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545639 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545644 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545650 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545655 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545659 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545664 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545669 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545673 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545679 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545686 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545691 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545696 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545700 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 18:49:16.549267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545705 2574 flags.go:64] FLAG: --enable-server="true"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545710 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545717 2574 flags.go:64] FLAG: --event-burst="100"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545723 2574 flags.go:64] FLAG: --event-qps="50"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545728 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545733 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545738 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545745 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545749 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545754 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545760 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545784 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545789 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545794 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545799 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545804 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545808 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545813 2574 flags.go:64] FLAG: --feature-gates=""
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545819 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545824 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545830 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545835 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545841 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545845 2574 flags.go:64] FLAG: --help="false"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545850 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-141-118.ec2.internal"
Apr 17 18:49:16.549960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545855 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545860 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545866 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545872 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545877 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545882 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545887 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545892 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545896 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545901 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545906 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545911 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545916 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545920 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545925 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545929 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545934 2574 flags.go:64] FLAG: --lock-file=""
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545939 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545944 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545949 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545957 2574 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545962 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545967 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 18:49:16.550551 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545971 2574 flags.go:64] FLAG: --logging-format="text"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545976 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545982 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545986 2574 flags.go:64] FLAG: --manifest-url=""
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545991 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.545998 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546003 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546010 2574 flags.go:64] FLAG: --max-pods="110"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546015 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546020 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546025 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546030 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546035 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546040 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546044 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546058 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546063 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546068 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546073 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546078 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546087 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546094 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546100 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546104 2574 flags.go:64] FLAG: --port="10250"
Apr 17 18:49:16.551116 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546109 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546114 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0866d80a003749719"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546119 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546124 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546128 2574 flags.go:64] FLAG: --register-node="true"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546133 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546138 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546144 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546149 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546153 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546158 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546163 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546168 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546173 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546178 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546183 2574 flags.go:64] FLAG: --runonce="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546188 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546194 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546199 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546204 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546208 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546213 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546218 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546223 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546228 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546233 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 18:49:16.551688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546237 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546242 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546247 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546252 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546257 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546266 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546271 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546276 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546283 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546288 2574 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546293 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546298 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546303 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546307 2574 flags.go:64] FLAG: --v="2"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546314 2574 flags.go:64] FLAG: --version="false"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546325 2574 flags.go:64] FLAG: --vmodule=""
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546331 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.546341 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546493 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546504 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546509 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546515 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546520 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:49:16.552336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546525 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546530 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546536 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546540 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546544 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546549 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546554 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546558 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546562 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546567 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546571 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546575 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546579 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546583 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546588 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546593 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546597 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546601 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546605 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546609 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:49:16.552955 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546612 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546616 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546621 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546624 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546629 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546635 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546642 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546646 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546650 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546654 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546659 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546663 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546666 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546670 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546676 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546680 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546684 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546688 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546692 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:49:16.553483 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546696 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546700 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546704 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546708 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546712 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546715 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546720 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546724 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546729 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546733 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546737 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546741 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546745 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546749 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546753 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546757 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546780 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546784 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546789 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546794 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:49:16.554206 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546798 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546802 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546806 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546810 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546813 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546817 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546821 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546826 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546830 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546834 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546838 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546843 2574 feature_gate.go:328] unrecognized feature
gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546847 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546851 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546855 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546859 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546863 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546867 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546872 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546876 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:49:16.554984 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546880 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:49:16.555935 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:16.546885 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:49:16.555935 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.547522 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:49:16.555935 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.555484 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 18:49:16.555935 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.555503 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 18:49:16.561286 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.557306 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 18:49:16.561677 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.561004 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 18:49:16.561864 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.561852 2574 server.go:1019] "Starting client certificate rotation" Apr 17
18:49:16.561982 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.561963 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:49:16.562012 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.562003 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:49:16.587470 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.587448 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:49:16.589833 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.589811 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:49:16.602431 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.602413 2574 log.go:25] "Validated CRI v1 runtime API" Apr 17 18:49:16.607828 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.607811 2574 log.go:25] "Validated CRI v1 image API" Apr 17 18:49:16.610745 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.610710 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 18:49:16.612080 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.612061 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:49:16.617272 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.617248 2574 fs.go:135] Filesystem UUIDs: map[32ee3041-a478-4587-b2bb-6d55ce759075:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 7c92aaba-3e7f-4c18-b682-2551827bf6f3:/dev/nvme0n1p4] Apr 17 18:49:16.617360 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.617270 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 18:49:16.622617 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.622513 2574 manager.go:217] Machine: {Timestamp:2026-04-17 18:49:16.621453235 +0000 UTC m=+0.394197633 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3125171 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2574cbf333c65e6f33bcca2dcb3c9c SystemUUID:ec2574cb-f333-c65e-6f33-bcca2dcb3c9c BootID:25192705-83db-447e-8970-6623829e444b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f3:e9:e7:58:47 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f3:e9:e7:58:47 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:13:fd:a8:ae:de Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 18:49:16.622617 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.622613 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 18:49:16.622728 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.622695 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 18:49:16.623014 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.622994 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 18:49:16.623160 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.623016 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-141-118.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 18:49:16.623202 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.623170 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 18:49:16.623202 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.623179 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 18:49:16.623202 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.623197 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:16.623286 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.623212 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 18:49:16.624457 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.624445 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:16.624563 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.624554 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 18:49:16.627119 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.627108 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 17 18:49:16.627158 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.627127 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 18:49:16.627807 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.627798 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 18:49:16.627841 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.627811 2574 kubelet.go:397] "Adding apiserver pod source" Apr 17 18:49:16.627841 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.627819 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 18:49:16.628853 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.628837 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 
18:49:16.628936 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.628858 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 18:49:16.631443 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.631415 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 18:49:16.632643 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.632631 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 18:49:16.634404 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634390 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634408 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634415 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634420 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634427 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634433 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634439 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634458 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634465 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 18:49:16.634477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634472 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 18:49:16.634909 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634487 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 18:49:16.634909 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.634497 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 18:49:16.635086 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.635068 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wlxth" Apr 17 18:49:16.635271 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.635262 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 18:49:16.635303 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.635272 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 18:49:16.638605 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.638581 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-118.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 18:49:16.638679 ip-10-0-141-118 kubenswrapper[2574]: 
E0417 18:49:16.638624 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 18:49:16.638679 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.638639 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 18:49:16.638852 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.638840 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 18:49:16.638892 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.638874 2574 server.go:1295] "Started kubelet" Apr 17 18:49:16.638959 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.638935 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 18:49:16.639108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.639071 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 18:49:16.639163 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.639133 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 18:49:16.639631 ip-10-0-141-118 systemd[1]: Started Kubernetes Kubelet. Apr 17 18:49:16.640150 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.640097 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 18:49:16.641167 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.641150 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wlxth" Apr 17 18:49:16.641750 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.641734 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 17 18:49:16.646960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.646942 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:16.647381 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.647361 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 18:49:16.647474 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.646648 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-118.ec2.internal.18a73975eb13d6bd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-118.ec2.internal,UID:ip-10-0-141-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-118.ec2.internal,},FirstTimestamp:2026-04-17 18:49:16.638852797 +0000 UTC m=+0.411597195,LastTimestamp:2026-04-17 18:49:16.638852797 +0000 UTC m=+0.411597195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-118.ec2.internal,}" Apr 17 18:49:16.647923 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.647909 2574 volume_manager.go:295] "The 
desired_state_of_world populator starts" Apr 17 18:49:16.647923 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.647922 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 18:49:16.648034 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648011 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 18:49:16.648078 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648059 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 17 18:49:16.648078 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648064 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 17 18:49:16.648192 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.648175 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:16.648648 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648634 2574 factory.go:55] Registering systemd factory Apr 17 18:49:16.648717 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648654 2574 factory.go:223] Registration of the systemd container factory successfully Apr 17 18:49:16.648879 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648838 2574 factory.go:153] Registering CRI-O factory Apr 17 18:49:16.648947 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648886 2574 factory.go:223] Registration of the crio container factory successfully Apr 17 18:49:16.648947 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648940 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 18:49:16.649040 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648967 2574 factory.go:103] Registering Raw factory Apr 17 18:49:16.649040 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.648983 2574 manager.go:1196] Started watching for new ooms in manager Apr 17 18:49:16.649517 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.649476 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 18:49:16.649606 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.649590 2574 manager.go:319] Starting recovery of all containers Apr 17 18:49:16.652874 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.652854 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:16.656029 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.655923 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-118.ec2.internal\" not found" node="ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.658496 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.658333 2574 manager.go:324] Recovery completed Apr 17 18:49:16.665781 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.665750 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.668337 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668321 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.668388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668353 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.668388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668363 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.668902 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668886 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 18:49:16.668902 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668899 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 18:49:16.669019 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.668915 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 18:49:16.670805 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.670793 2574 policy_none.go:49] "None policy: Start" Apr 17 18:49:16.670844 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.670810 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 18:49:16.670844 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.670820 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 17 18:49:16.720423 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720399 2574 manager.go:341] "Starting Device Plugin manager" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.720446 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720459 2574 server.go:85] "Starting device plugin registration server" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720739 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720750 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720895 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 
18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720960 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.720968 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.721484 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 18:49:16.723990 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.721526 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:16.785662 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.785627 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 18:49:16.786983 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.786931 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 18:49:16.787468 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.787448 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 18:49:16.787575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.787497 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 18:49:16.787575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.787507 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 18:49:16.787575 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.787542 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 18:49:16.790573 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.790550 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:16.821518 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.821496 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.822446 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.822424 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.822549 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.822454 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.822549 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.822480 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.822549 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.822503 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.830206 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.830189 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.830268 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.830216 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-118.ec2.internal\": node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:16.843953 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.843927 
2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:16.888229 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.888186 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal"] Apr 17 18:49:16.888331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.888274 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.889194 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.889180 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.889260 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.889211 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.889260 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.889221 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.890342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.890331 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.890488 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.890475 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.890521 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.890502 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.891068 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891043 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.891175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891074 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.891175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891043 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.891175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891107 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.891175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891118 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.891175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.891085 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.892243 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.892225 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.892336 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.892256 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 18:49:16.892941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.892926 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientMemory" Apr 17 18:49:16.893036 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.892954 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 18:49:16.893036 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.892968 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeHasSufficientPID" Apr 17 18:49:16.913430 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.913405 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-118.ec2.internal\" not found" node="ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.917833 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.917818 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-118.ec2.internal\" not found" node="ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.944897 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:16.944877 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:16.949185 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.949170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/de49da82ba182640048bc9ceae0a365f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-118.ec2.internal\" (UID: \"de49da82ba182640048bc9ceae0a365f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.949241 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.949195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:16.949241 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:16.949213 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.045314 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.045227 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.049534 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/de49da82ba182640048bc9ceae0a365f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-118.ec2.internal\" (UID: \"de49da82ba182640048bc9ceae0a365f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.049614 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049572 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/de49da82ba182640048bc9ceae0a365f-config\") pod \"kube-apiserver-proxy-ip-10-0-141-118.ec2.internal\" (UID: \"de49da82ba182640048bc9ceae0a365f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.049675 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.049675 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.049758 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.049758 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.049709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a008543323d6e8e1d6e408e1161d0c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal\" (UID: \"f6a008543323d6e8e1d6e408e1161d0c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.146028 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.145986 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.215503 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.215468 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.219970 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.219951 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.246614 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.246578 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.347374 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.347336 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.447810 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.447757 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.548366 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.548333 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.561679 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.561644 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 18:49:17.561832 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.561816 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:49:17.561895 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.561836 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 18:49:17.645821 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.645724 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 18:44:16 +0000 UTC" deadline="2027-12-28 22:39:10.548185696 +0000 UTC" Apr 17 18:49:17.645821 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.645755 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14883h49m52.902433794s" Apr 17 18:49:17.647045 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.647027 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 18:49:17.649085 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:17.649067 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.656043 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.656022 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:49:17.674644 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.674626 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8wknt" Apr 17 18:49:17.681837 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.681810 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8wknt" Apr 17 18:49:17.749346 ip-10-0-141-118 
kubenswrapper[2574]: E0417 18:49:17.749307 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-118.ec2.internal\" not found" Apr 17 18:49:17.770802 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.770776 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:17.812699 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:17.812669 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde49da82ba182640048bc9ceae0a365f.slice/crio-78b29cbbf012405669a867cf82e875a5a7c5fb2450e7932c6452c6b671eabb55 WatchSource:0}: Error finding container 78b29cbbf012405669a867cf82e875a5a7c5fb2450e7932c6452c6b671eabb55: Status 404 returned error can't find the container with id 78b29cbbf012405669a867cf82e875a5a7c5fb2450e7932c6452c6b671eabb55 Apr 17 18:49:17.812969 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:17.812952 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a008543323d6e8e1d6e408e1161d0c.slice/crio-7b6e7cbef8b5ee579443f5163991303a7a46fb4d0da5142916c09b04d67ea1ad WatchSource:0}: Error finding container 7b6e7cbef8b5ee579443f5163991303a7a46fb4d0da5142916c09b04d67ea1ad: Status 404 returned error can't find the container with id 7b6e7cbef8b5ee579443f5163991303a7a46fb4d0da5142916c09b04d67ea1ad Apr 17 18:49:17.817006 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.816980 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:49:17.848548 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.848525 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.858529 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.858504 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:17.860220 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.860206 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" Apr 17 18:49:17.869701 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:17.869684 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:49:18.128844 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.128817 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:18.629045 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.629008 2574 apiserver.go:52] "Watching apiserver" Apr 17 18:49:18.638393 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.638360 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 18:49:18.639947 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.639914 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-node-tuning-operator/tuned-t2r2f","openshift-dns/node-resolver-fzlng","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal","openshift-network-operator/iptables-alerter-m7v2p","kube-system/konnectivity-agent-j7fsm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z","openshift-image-registry/node-ca-n7gs7","openshift-multus/multus-additional-cni-plugins-4xm2n","openshift-multus/multus-xc74x","openshift-multus/network-metrics-daemon-dpqmj","openshift-network-diagnostics/network-check-target-4wxbg","openshift-ovn-kubernetes/ovnkube-node-rj69g","kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal"] Apr 17 18:49:18.642293 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.642275 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.643379 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.643357 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.644914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.644565 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2vzsf\"" Apr 17 18:49:18.644914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.644589 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 18:49:18.644914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.644678 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.644914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.644780 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.645480 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.645459 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.645480 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.645475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.645625 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.645553 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.645732 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.645713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xlxz2\"" Apr 17 18:49:18.647090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.647070 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.647847 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.647693 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 18:49:18.647847 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.647736 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 18:49:18.647847 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.647795 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rqpqv\"" Apr 17 18:49:18.648640 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.648242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.648640 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.648315 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.649053 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.649037 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.649285 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.649237 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 18:49:18.649362 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.649332 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74dfs\"" Apr 17 18:49:18.649852 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.649414 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.650900 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.650262 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.650900 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.650520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gczng\"" Apr 17 18:49:18.650900 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.650617 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.651062 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.651035 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.651298 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.651278 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.651542 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.651520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b6ngv\"" Apr 17 18:49:18.651631 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.651548 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.651631 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.651548 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 18:49:18.652561 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.652539 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.653037 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.653013 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8mcwb\"" Apr 17 18:49:18.653153 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.653140 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.653332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.653321 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 18:49:18.653400 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.653386 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 18:49:18.653462 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.653452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.654082 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.654064 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:18.654164 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.654144 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:18.654229 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.654067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.654634 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.654301 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:18.655377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.654935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dn6ct\"" Apr 17 18:49:18.655377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.655066 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 18:49:18.655488 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.654935 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 18:49:18.657106 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rm29\" (UniqueName: \"kubernetes.io/projected/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-kube-api-access-9rm29\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.657200 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-kubernetes\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657200 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657177 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-lib-modules\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657311 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657092 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.657311 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-tmp\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657311 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19201734-1263-46e9-b401-4768c56c505c-hosts-file\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.657440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2549m\" (UniqueName: \"kubernetes.io/projected/19201734-1263-46e9-b401-4768c56c505c-kube-api-access-2549m\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.657440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657346 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.657440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657382 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-host\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7681ddb2-af4c-468a-93b5-9e7d47992b0f-agent-certs\") pod \"konnectivity-agent-j7fsm\" (UID: \"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.657621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.657621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657516 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-modprobe-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-systemd\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-run\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-var-lib-kubelet\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.657877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7681ddb2-af4c-468a-93b5-9e7d47992b0f-konnectivity-ca\") pod \"konnectivity-agent-j7fsm\" (UID: \"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.657877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657825 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/075172fd-6f0b-45b8-8765-5b6397bdb2b8-serviceca\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.657877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657923 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdw65\" (UniqueName: 
\"kubernetes.io/projected/a6ece008-1cf5-4646-9354-1111778d622b-kube-api-access-bdw65\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ecc75342-1be7-4279-95ad-c14e71294a6e-iptables-alerter-script\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.657989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-device-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658060 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-system-cni-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84z6v\" (UniqueName: \"kubernetes.io/projected/075172fd-6f0b-45b8-8765-5b6397bdb2b8-kube-api-access-84z6v\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctw9\" (UniqueName: \"kubernetes.io/projected/ecc75342-1be7-4279-95ad-c14e71294a6e-kube-api-access-6ctw9\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-sys-fs\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-etc-tuned\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/075172fd-6f0b-45b8-8765-5b6397bdb2b8-host\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19201734-1263-46e9-b401-4768c56c505c-tmp-dir\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-socket-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658331 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-registration-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658754 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-os-release\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.658754 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysconfig\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.658754 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-conf\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.658754 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-sys\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.658972 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecc75342-1be7-4279-95ad-c14e71294a6e-host-slash\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.658972 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658822 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjs7\" (UniqueName: \"kubernetes.io/projected/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kube-api-access-nrjs7\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.658972 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.658880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cnibin\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.659547 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.659522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 18:49:18.659920 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.659900 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sfft2\"" Apr 17 18:49:18.660055 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.660009 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 18:49:18.660640 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.660624 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 18:49:18.660721 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.660661 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 18:49:18.660833 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.660808 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 18:49:18.660914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.660871 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 18:49:18.683887 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.683853 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:17 +0000 UTC" 
deadline="2027-10-13 15:22:29.734828128 +0000 UTC" Apr 17 18:49:18.683887 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.683886 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13052h33m11.050945398s" Apr 17 18:49:18.749130 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.749106 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 18:49:18.759196 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84z6v\" (UniqueName: \"kubernetes.io/projected/075172fd-6f0b-45b8-8765-5b6397bdb2b8-kube-api-access-84z6v\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.759321 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-sys-fs\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.759321 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-sys-fs\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.759432 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-etc-tuned\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.759432 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-netd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.759432 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-k8s-cni-cncf-io\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.759569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-hostroot\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.759569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-etc-kubernetes\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.759569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19201734-1263-46e9-b401-4768c56c505c-tmp-dir\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.759569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-socket-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.759569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-os-release\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-netns\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-sys\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-os-release\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-socket-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759703 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759734 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-sys\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-daemon-config\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.759827 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecc75342-1be7-4279-95ad-c14e71294a6e-host-slash\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjs7\" (UniqueName: \"kubernetes.io/projected/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kube-api-access-nrjs7\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cnibin\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19201734-1263-46e9-b401-4768c56c505c-tmp-dir\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-lib-modules\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-var-lib-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-env-overrides\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.759998 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-kubelet\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-lib-modules\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-node-log\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cnibin\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.760090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2549m\" (UniqueName: \"kubernetes.io/projected/19201734-1263-46e9-b401-4768c56c505c-kube-api-access-2549m\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ecc75342-1be7-4279-95ad-c14e71294a6e-host-slash\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.760425 ip-10-0-141-118 
kubenswrapper[2574]: I0417 18:49:18.760171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-script-lib\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-cnibin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-multus-certs\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-modprobe-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-run\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760425 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760367 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrshn\" (UniqueName: \"kubernetes.io/projected/6e2d8622-7004-4d8f-9297-ccace6582a00-kube-api-access-hrshn\") pod \"multus-xc74x\" (UID: 
\"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdw65\" (UniqueName: \"kubernetes.io/projected/a6ece008-1cf5-4646-9354-1111778d622b-kube-api-access-bdw65\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ecc75342-1be7-4279-95ad-c14e71294a6e-iptables-alerter-script\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-modprobe-d\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760584 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-systemd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-socket-dir-parent\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctw9\" (UniqueName: \"kubernetes.io/projected/ecc75342-1be7-4279-95ad-c14e71294a6e-kube-api-access-6ctw9\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-run\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760692 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-bin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-multus\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/075172fd-6f0b-45b8-8765-5b6397bdb2b8-host\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-registration-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysconfig\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-conf\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7q95\" (UniqueName: \"kubernetes.io/projected/9e29f722-7b28-401a-9488-46ff42062854-kube-api-access-n7q95\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.760941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-bin\") pod \"ovnkube-node-rj69g\" (UID: 
\"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-etc-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-config\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.760993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rm29\" (UniqueName: \"kubernetes.io/projected/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-kube-api-access-9rm29\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-kubernetes\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/075172fd-6f0b-45b8-8765-5b6397bdb2b8-host\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-registration-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysconfig\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-sysctl-conf\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ecc75342-1be7-4279-95ad-c14e71294a6e-iptables-alerter-script\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-tmp\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-systemd-units\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91638f07-e924-407f-bb78-79ea02748faa-ovn-node-metrics-cert\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-netns\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.761747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19201734-1263-46e9-b401-4768c56c505c-hosts-file\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-kubernetes\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-host\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-ovn\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761869 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-log-socket\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-os-release\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-cni-binary-copy\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-conf-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-host\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7681ddb2-af4c-468a-93b5-9e7d47992b0f-agent-certs\") pod \"konnectivity-agent-j7fsm\" (UID: 
\"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-systemd\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-var-lib-kubelet\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7681ddb2-af4c-468a-93b5-9e7d47992b0f-konnectivity-ca\") pod \"konnectivity-agent-j7fsm\" (UID: \"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/075172fd-6f0b-45b8-8765-5b6397bdb2b8-serviceca\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.762498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762279 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-device-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-system-cni-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-kubelet\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-slash\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpdw5\" (UniqueName: \"kubernetes.io/projected/91638f07-e924-407f-bb78-79ea02748faa-kube-api-access-lpdw5\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-system-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.761845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19201734-1263-46e9-b401-4768c56c505c-hosts-file\") pod 
\"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-device-dir\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-etc-systemd\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.762941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6ece008-1cf5-4646-9354-1111778d622b-var-lib-kubelet\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763275 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-etc-selinux\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.763975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/075172fd-6f0b-45b8-8765-5b6397bdb2b8-serviceca\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.763975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-system-cni-dir\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.763975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.763952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7681ddb2-af4c-468a-93b5-9e7d47992b0f-konnectivity-ca\") pod \"konnectivity-agent-j7fsm\" (UID: \"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.764255 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.764238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-tmp\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.764340 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.764304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-cni-binary-copy\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.765604 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.765201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7681ddb2-af4c-468a-93b5-9e7d47992b0f-agent-certs\") pod \"konnectivity-agent-j7fsm\" (UID: \"7681ddb2-af4c-468a-93b5-9e7d47992b0f\") " pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.766852 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.766825 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84z6v\" (UniqueName: \"kubernetes.io/projected/075172fd-6f0b-45b8-8765-5b6397bdb2b8-kube-api-access-84z6v\") pod \"node-ca-n7gs7\" (UID: \"075172fd-6f0b-45b8-8765-5b6397bdb2b8\") " pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.767247 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.767222 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a6ece008-1cf5-4646-9354-1111778d622b-etc-tuned\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.768958 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.768887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ctw9\" (UniqueName: \"kubernetes.io/projected/ecc75342-1be7-4279-95ad-c14e71294a6e-kube-api-access-6ctw9\") pod \"iptables-alerter-m7v2p\" (UID: \"ecc75342-1be7-4279-95ad-c14e71294a6e\") " pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.770273 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.770024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjs7\" (UniqueName: \"kubernetes.io/projected/fcd0f83b-ece2-4fbc-9deb-af086f5f39d6-kube-api-access-nrjs7\") pod \"aws-ebs-csi-driver-node-kbc5z\" (UID: \"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.771958 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.771429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2549m\" (UniqueName: \"kubernetes.io/projected/19201734-1263-46e9-b401-4768c56c505c-kube-api-access-2549m\") pod \"node-resolver-fzlng\" (UID: \"19201734-1263-46e9-b401-4768c56c505c\") " pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.772803 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.772567 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rm29\" (UniqueName: \"kubernetes.io/projected/b985cdb5-8694-44dd-aa5f-2770ef10e3c4-kube-api-access-9rm29\") pod \"multus-additional-cni-plugins-4xm2n\" (UID: \"b985cdb5-8694-44dd-aa5f-2770ef10e3c4\") " pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:18.772886 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.772846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdw65\" (UniqueName: \"kubernetes.io/projected/a6ece008-1cf5-4646-9354-1111778d622b-kube-api-access-bdw65\") pod \"tuned-t2r2f\" (UID: \"a6ece008-1cf5-4646-9354-1111778d622b\") " pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.791374 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.791327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" event={"ID":"f6a008543323d6e8e1d6e408e1161d0c","Type":"ContainerStarted","Data":"7b6e7cbef8b5ee579443f5163991303a7a46fb4d0da5142916c09b04d67ea1ad"} Apr 17 18:49:18.792376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.792350 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" event={"ID":"de49da82ba182640048bc9ceae0a365f","Type":"ContainerStarted","Data":"78b29cbbf012405669a867cf82e875a5a7c5fb2450e7932c6452c6b671eabb55"} Apr 17 18:49:18.816561 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.816538 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:18.863591 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7q95\" (UniqueName: \"kubernetes.io/projected/9e29f722-7b28-401a-9488-46ff42062854-kube-api-access-n7q95\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-bin\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-etc-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863680 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-config\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-systemd-units\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863732 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-etc-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.863761 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91638f07-e924-407f-bb78-79ea02748faa-ovn-node-metrics-cert\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-netns\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-bin\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863822 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-ovn\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863840 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-log-socket\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-os-release\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-cni-binary-copy\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-conf-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-kubelet\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-slash\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.863978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-slash\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: 
I0417 18:49:18.864011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-os-release\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpdw5\" (UniqueName: \"kubernetes.io/projected/91638f07-e924-407f-bb78-79ea02748faa-kube-api-access-lpdw5\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-system-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-netd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-k8s-cni-cncf-io\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864219 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-hostroot\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-etc-kubernetes\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-netns\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-daemon-config\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864282 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-var-lib-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-config\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-env-overrides\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864334 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-var-lib-openvswitch\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-kubelet\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-node-log\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-cni-binary-copy\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 
ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864412 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-script-lib\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-conf-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-cnibin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-multus-certs\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.864971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrshn\" (UniqueName: \"kubernetes.io/projected/6e2d8622-7004-4d8f-9297-ccace6582a00-kube-api-access-hrshn\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864498 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-systemd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-socket-dir-parent\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-bin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 
kubenswrapper[2574]: I0417 18:49:18.864548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-multus\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-kubelet\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-systemd-units\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-multus\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-multus-certs\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-kubelet\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-cnibin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-systemd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-socket-dir-parent\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864790 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-var-lib-cni-bin\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.864909 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.864994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-env-overrides\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-ovn-kubernetes\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.865620 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.865044 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:19.364962913 +0000 UTC m=+3.137707303 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-cni-netd\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-hostroot\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-netns\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-system-cni-dir\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-node-log\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-etc-kubernetes\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865296 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-host-run-netns\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-run-ovn\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/6e2d8622-7004-4d8f-9297-ccace6582a00-host-run-k8s-cni-cncf-io\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91638f07-e924-407f-bb78-79ea02748faa-log-socket\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91638f07-e924-407f-bb78-79ea02748faa-ovnkube-script-lib\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.866376 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.865836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6e2d8622-7004-4d8f-9297-ccace6582a00-multus-daemon-config\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.870251 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.870225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91638f07-e924-407f-bb78-79ea02748faa-ovn-node-metrics-cert\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.871012 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.870987 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:18.871012 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.871016 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:18.871012 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.871029 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:18.871206 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:18.871105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:19.371084994 +0000 UTC m=+3.143829395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:18.873523 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.873482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7q95\" (UniqueName: \"kubernetes.io/projected/9e29f722-7b28-401a-9488-46ff42062854-kube-api-access-n7q95\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:18.873667 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.873644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpdw5\" (UniqueName: \"kubernetes.io/projected/91638f07-e924-407f-bb78-79ea02748faa-kube-api-access-lpdw5\") pod \"ovnkube-node-rj69g\" (UID: \"91638f07-e924-407f-bb78-79ea02748faa\") " pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:18.873667 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.873657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrshn\" (UniqueName: \"kubernetes.io/projected/6e2d8622-7004-4d8f-9297-ccace6582a00-kube-api-access-hrshn\") pod \"multus-xc74x\" (UID: \"6e2d8622-7004-4d8f-9297-ccace6582a00\") " pod="openshift-multus/multus-xc74x" Apr 17 18:49:18.955271 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.955199 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n7gs7" Apr 17 18:49:18.963087 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.963063 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fzlng" Apr 17 18:49:18.969860 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.969833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:18.975417 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.975390 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" Apr 17 18:49:18.981978 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.981959 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-m7v2p" Apr 17 18:49:18.989527 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.989501 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" Apr 17 18:49:18.996094 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:18.996077 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xc74x" Apr 17 18:49:19.003666 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.003650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" Apr 17 18:49:19.010302 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.010286 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:19.134986 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.134949 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 18:49:19.368253 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.368220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:19.368427 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.368382 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:19.368506 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.368458 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:20.368436429 +0000 UTC m=+4.141180816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:19.438449 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.438016 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod075172fd_6f0b_45b8_8765_5b6397bdb2b8.slice/crio-bf59cdb41b102500cf8bda9a1f96f61edf4033b48b6ef1cf4635208e8fa52593 WatchSource:0}: Error finding container bf59cdb41b102500cf8bda9a1f96f61edf4033b48b6ef1cf4635208e8fa52593: Status 404 returned error can't find the container with id bf59cdb41b102500cf8bda9a1f96f61edf4033b48b6ef1cf4635208e8fa52593 Apr 17 18:49:19.442394 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.442365 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb985cdb5_8694_44dd_aa5f_2770ef10e3c4.slice/crio-b79131d8ec03d27546cc79d0968faf7d48c0531da67287e387e43e223d2d92a9 WatchSource:0}: Error finding container b79131d8ec03d27546cc79d0968faf7d48c0531da67287e387e43e223d2d92a9: Status 404 returned error can't find the container with id b79131d8ec03d27546cc79d0968faf7d48c0531da67287e387e43e223d2d92a9 Apr 17 18:49:19.443327 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.443303 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91638f07_e924_407f_bb78_79ea02748faa.slice/crio-d5ef8296d92d46edb91b9beab3a5473d9097f5d28b540f10823c95023c57f1f6 WatchSource:0}: Error finding container d5ef8296d92d46edb91b9beab3a5473d9097f5d28b540f10823c95023c57f1f6: Status 404 returned error can't find the container with id d5ef8296d92d46edb91b9beab3a5473d9097f5d28b540f10823c95023c57f1f6 Apr 17 18:49:19.444417 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.444306 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2d8622_7004_4d8f_9297_ccace6582a00.slice/crio-a338ce3699fc734044fb9081a6d6d4111413b41e46635ca8cb0aa3385606082b WatchSource:0}: Error finding container a338ce3699fc734044fb9081a6d6d4111413b41e46635ca8cb0aa3385606082b: Status 404 returned error can't find the container with id a338ce3699fc734044fb9081a6d6d4111413b41e46635ca8cb0aa3385606082b Apr 17 18:49:19.445177 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.445157 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecc75342_1be7_4279_95ad_c14e71294a6e.slice/crio-ca9a5fd25fef1f17b4290cd89bf6ff947a39e89d69bc8b28383927da7ca7cc6e WatchSource:0}: Error finding container ca9a5fd25fef1f17b4290cd89bf6ff947a39e89d69bc8b28383927da7ca7cc6e: Status 404 returned error can't find the container with id ca9a5fd25fef1f17b4290cd89bf6ff947a39e89d69bc8b28383927da7ca7cc6e Apr 17 18:49:19.446486 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.446140 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ece008_1cf5_4646_9354_1111778d622b.slice/crio-bf60c225cb8cb6454bbd83eceff6da614b0b7f3163eac66fb66f5f0351db768f WatchSource:0}: Error finding container bf60c225cb8cb6454bbd83eceff6da614b0b7f3163eac66fb66f5f0351db768f: Status 404 returned error can't find the container with id bf60c225cb8cb6454bbd83eceff6da614b0b7f3163eac66fb66f5f0351db768f Apr 17 18:49:19.448123 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.448098 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd0f83b_ece2_4fbc_9deb_af086f5f39d6.slice/crio-6571113c531daef16ff6ed15df9b659d9da210049a6581f05ebc1bbff6ef6346 WatchSource:0}: Error finding container 6571113c531daef16ff6ed15df9b659d9da210049a6581f05ebc1bbff6ef6346: Status 404 returned error can't find the container with id 6571113c531daef16ff6ed15df9b659d9da210049a6581f05ebc1bbff6ef6346 Apr 17 18:49:19.449567 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.449415 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7681ddb2_af4c_468a_93b5_9e7d47992b0f.slice/crio-a647b1d9b4d55a81d393e871381f30019c873d5b12a7409259339993968ffa41 WatchSource:0}: Error finding container a647b1d9b4d55a81d393e871381f30019c873d5b12a7409259339993968ffa41: Status 404 returned error can't find the container with id a647b1d9b4d55a81d393e871381f30019c873d5b12a7409259339993968ffa41 Apr 17 18:49:19.450902 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:19.450857 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19201734_1263_46e9_b401_4768c56c505c.slice/crio-36c4d078f939b1912668537073ba12bfad9148a37529bdc199fdeab31332bbde WatchSource:0}: Error finding container 36c4d078f939b1912668537073ba12bfad9148a37529bdc199fdeab31332bbde: Status 404 returned error can't find the container with id 36c4d078f939b1912668537073ba12bfad9148a37529bdc199fdeab31332bbde Apr 17 18:49:19.469162 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.469133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: 
\"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:19.469300 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.469278 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:19.469300 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.469303 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:19.469409 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.469313 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:19.469409 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:19.469365 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:20.469347156 +0000 UTC m=+4.242091556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:19.684614 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.684340 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:44:17 +0000 UTC" deadline="2027-09-22 12:19:59.22513973 +0000 UTC" Apr 17 18:49:19.684614 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.684547 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12545h30m39.540596871s" Apr 17 18:49:19.805055 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.804992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7v2p" event={"ID":"ecc75342-1be7-4279-95ad-c14e71294a6e","Type":"ContainerStarted","Data":"ca9a5fd25fef1f17b4290cd89bf6ff947a39e89d69bc8b28383927da7ca7cc6e"} Apr 17 18:49:19.816414 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.816378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"d5ef8296d92d46edb91b9beab3a5473d9097f5d28b540f10823c95023c57f1f6"} Apr 17 18:49:19.818196 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.818171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerStarted","Data":"b79131d8ec03d27546cc79d0968faf7d48c0531da67287e387e43e223d2d92a9"} Apr 17 18:49:19.820592 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.820568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" event={"ID":"de49da82ba182640048bc9ceae0a365f","Type":"ContainerStarted","Data":"65fb3b7ed9a043c3180824be1360072d054a3603f2879b1230adeabe83788b83"} Apr 17 18:49:19.825975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.825948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fzlng" event={"ID":"19201734-1263-46e9-b401-4768c56c505c","Type":"ContainerStarted","Data":"36c4d078f939b1912668537073ba12bfad9148a37529bdc199fdeab31332bbde"} Apr 17 18:49:19.828108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.828078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" event={"ID":"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6","Type":"ContainerStarted","Data":"6571113c531daef16ff6ed15df9b659d9da210049a6581f05ebc1bbff6ef6346"} Apr 17 18:49:19.831980 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.831942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" event={"ID":"a6ece008-1cf5-4646-9354-1111778d622b","Type":"ContainerStarted","Data":"bf60c225cb8cb6454bbd83eceff6da614b0b7f3163eac66fb66f5f0351db768f"} Apr 17 18:49:19.834452 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.834407 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-118.ec2.internal" podStartSLOduration=2.834370496 podStartE2EDuration="2.834370496s" podCreationTimestamp="2026-04-17 18:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:19.833542017 +0000 UTC m=+3.606286425" watchObservedRunningTime="2026-04-17 18:49:19.834370496 +0000 UTC m=+3.607114905" Apr 17 18:49:19.837867 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.837845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xc74x" event={"ID":"6e2d8622-7004-4d8f-9297-ccace6582a00","Type":"ContainerStarted","Data":"a338ce3699fc734044fb9081a6d6d4111413b41e46635ca8cb0aa3385606082b"} Apr 17 18:49:19.842175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.842150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7gs7" event={"ID":"075172fd-6f0b-45b8-8765-5b6397bdb2b8","Type":"ContainerStarted","Data":"bf59cdb41b102500cf8bda9a1f96f61edf4033b48b6ef1cf4635208e8fa52593"} Apr 17 18:49:19.843724 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:19.843700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j7fsm" event={"ID":"7681ddb2-af4c-468a-93b5-9e7d47992b0f","Type":"ContainerStarted","Data":"a647b1d9b4d55a81d393e871381f30019c873d5b12a7409259339993968ffa41"} Apr 17 18:49:20.376269 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.376235 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:20.376421 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.376382 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:20.376478 ip-10-0-141-118 kubenswrapper[2574]: E0417 
18:49:20.376438 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.376418604 +0000 UTC m=+6.149163013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:20.477909 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.477217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:20.477909 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.477396 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:20.477909 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.477417 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:20.477909 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.477430 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:20.477909 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.477489 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.477470612 +0000 UTC m=+6.250215013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:20.790542 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.790464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:20.791017 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.790587 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:20.791017 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.790698 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:20.791017 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:20.790817 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:20.854530 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.854494 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6a008543323d6e8e1d6e408e1161d0c" containerID="b1f04263479b54e9502c6a2b37c6f887ff8851b5956b1a25ec28ea3af8d81f3b" exitCode=0 Apr 17 18:49:20.855428 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:20.855387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" event={"ID":"f6a008543323d6e8e1d6e408e1161d0c","Type":"ContainerDied","Data":"b1f04263479b54e9502c6a2b37c6f887ff8851b5956b1a25ec28ea3af8d81f3b"} Apr 17 18:49:21.449236 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.449153 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bq86n"] Apr 17 18:49:21.451002 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.450977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.451143 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:21.451062 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:21.483811 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.483756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-kubelet-config\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.483985 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.483873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.483985 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.483940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-dbus\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.584933 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.584878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-kubelet-config\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.584933 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.584931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.585161 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.584977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-dbus\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.585217 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.585167 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-dbus\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.585276 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.585228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-kubelet-config\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:21.585379 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:21.585332 2574 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:21.585434 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:21.585393 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:22.085372872 +0000 UTC m=+5.858117261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:21.875674 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:21.875613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" event={"ID":"f6a008543323d6e8e1d6e408e1161d0c","Type":"ContainerStarted","Data":"83d895094c5dd0b662fff91f98c111c305434cfd466a540c215c91c3ee668045"} Apr 17 18:49:22.089111 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.089062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:22.089287 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.089200 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:22.089287 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.089261 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:23.089242216 +0000 UTC m=+6.861986611 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:22.392599 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.392563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:22.392787 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.392731 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.392847 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.392806 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:26.39278838 +0000 UTC m=+10.165532771 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:22.493872 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.493832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:22.494058 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.494019 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:22.494058 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.494038 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:22.494058 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.494050 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.494235 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.494106 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:26.494089043 +0000 UTC m=+10.266833432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.788003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.788137 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.788145 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.788295 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:22.788306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:22.788521 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:22.788408 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:23.099439 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:23.099400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:23.100015 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:23.099565 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:23.100015 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:23.099641 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:25.099619646 +0000 UTC m=+8.872364038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:24.788930 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:24.788898 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:24.789457 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:24.788942 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:24.789457 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:24.788898 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:24.789457 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:24.789043 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:24.789457 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:24.789138 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:24.789457 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:24.789220 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:25.117695 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:25.117657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:25.117921 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:25.117837 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:25.117921 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:25.117901 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:29.117882905 +0000 UTC m=+12.890627307 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:26.427887 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:26.427855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:26.428374 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.427988 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:26.428374 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.428045 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:34.428030918 +0000 UTC m=+18.200775330 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:26.529101 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:26.529066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:26.529241 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.529192 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:26.529241 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.529207 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:26.529241 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.529216 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:26.529342 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.529260 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:34.529245735 +0000 UTC m=+18.301990120 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:26.789741 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.789887 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:26.789946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.790011 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:26.790366 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:26.790535 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:26.790449 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:28.790156 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:28.790293 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:28.790660 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:28.790795 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:28.790874 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:28.791120 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:28.790933 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:29.150333 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:29.150298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:29.150517 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:29.150476 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:29.150561 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:29.150553 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:37.150530748 +0000 UTC m=+20.923275138 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:30.788524 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:30.788489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:30.788992 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:30.788491 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:30.788992 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:30.788603 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:30.788992 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:30.788619 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:30.788992 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:30.788735 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:30.788992 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:30.788831 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:32.788349 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:32.788305 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:32.788349 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:32.788341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:32.788898 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:32.788368 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:32.788898 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:32.788536 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:32.788898 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:32.788600 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:32.788898 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:32.788664 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:34.488341 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:34.488307 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:34.488815 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.488451 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:34.488815 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.488505 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:50.488491219 +0000 UTC m=+34.261235609 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:34.589598 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:34.589553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:34.589798 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.589754 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:34.589881 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.589801 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:34.589881 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.589816 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:34.589972 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.589890 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:50.589868695 +0000 UTC m=+34.362613082 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:34.788781 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:34.788682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:34.788943 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:34.788682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:34.788943 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.788818 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:34.788943 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.788913 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:34.788943 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:34.788682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:34.789146 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:34.789025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.788634 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:36.788738 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.789181 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:36.789278 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.789336 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:36.790810 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:36.789407 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:36.906038 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.905801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"82f7ae7ba69a148451691263a935841349b8636ee32ea842d2721e738e1e28f2"} Apr 17 18:49:36.907719 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.907641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerStarted","Data":"ebd607fb61b443953560580997ff87d9e13398e488552aeeaa1fc84b47f8e7c8"} Apr 17 18:49:36.910747 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.910432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" event={"ID":"a6ece008-1cf5-4646-9354-1111778d622b","Type":"ContainerStarted","Data":"71e371d0195ac455fadc67971adc0483fca2c491d769438c561d9f649a5bd7a1"} Apr 17 18:49:36.912820 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.912575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xc74x" event={"ID":"6e2d8622-7004-4d8f-9297-ccace6582a00","Type":"ContainerStarted","Data":"d09aab497411c2c6886541add4c9d989195551dc35d763f3561401eceb516de7"} Apr 17 18:49:36.915898 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.915870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7gs7" event={"ID":"075172fd-6f0b-45b8-8765-5b6397bdb2b8","Type":"ContainerStarted","Data":"595c9641ed793e7be2608fe09323f8d2e533c4cbb3a5d72f190afe4dce1d7d20"} Apr 17 18:49:36.930317 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.930265 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-118.ec2.internal" podStartSLOduration=19.930248448 podStartE2EDuration="19.930248448s" podCreationTimestamp="2026-04-17 18:49:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:49:21.890689483 +0000 UTC m=+5.663433889" watchObservedRunningTime="2026-04-17 18:49:36.930248448 +0000 UTC 
m=+20.702992857" Apr 17 18:49:36.942644 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.942603 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t2r2f" podStartSLOduration=3.724761338 podStartE2EDuration="20.942589698s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.448695504 +0000 UTC m=+3.221439905" lastFinishedPulling="2026-04-17 18:49:36.66652388 +0000 UTC m=+20.439268265" observedRunningTime="2026-04-17 18:49:36.942145417 +0000 UTC m=+20.714889828" watchObservedRunningTime="2026-04-17 18:49:36.942589698 +0000 UTC m=+20.715334106" Apr 17 18:49:36.956634 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:36.956577 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xc74x" podStartSLOduration=3.734598606 podStartE2EDuration="20.956558192s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.446367934 +0000 UTC m=+3.219112336" lastFinishedPulling="2026-04-17 18:49:36.668327522 +0000 UTC m=+20.441071922" observedRunningTime="2026-04-17 18:49:36.955894576 +0000 UTC m=+20.728638996" watchObservedRunningTime="2026-04-17 18:49:36.956558192 +0000 UTC m=+20.729302601" Apr 17 18:49:37.210973 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.210806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:37.211083 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:37.210951 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:37.211083 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:37.211081 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret podName:4cde13d3-2f3b-4ae9-b90e-00369cefc3cf nodeName:}" failed. No retries permitted until 2026-04-17 18:49:53.211066849 +0000 UTC m=+36.983811240 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret") pod "global-pull-secret-syncer-bq86n" (UID: "4cde13d3-2f3b-4ae9-b90e-00369cefc3cf") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:49:37.919301 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.919226 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-j7fsm" event={"ID":"7681ddb2-af4c-468a-93b5-9e7d47992b0f","Type":"ContainerStarted","Data":"02560153e1802e1a9266c6c32e18d2e487ae3dd6da5fda7226b9d4fbb42e8181"} Apr 17 18:49:37.921617 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921596 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:49:37.921889 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921867 2574 generic.go:358] "Generic (PLEG): container finished" podID="91638f07-e924-407f-bb78-79ea02748faa" containerID="dd647619c169f42812ba3fb5196dd9bdf62889566b405fac379289e34492e113" exitCode=1 Apr 17 18:49:37.921955 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"cc2995573bcc29a034311e335bd0ee21f9a38834752041968af79ae2b94ca3cd"} Apr 17 18:49:37.921955 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"c39028fb7b544ad784a0c921c1543f0954f246b426fc3d58a38a9af161fab810"} Apr 17 18:49:37.921955 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921949 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"d7dbc32c61916c85f4380d34b9c559f4fbb9bf3232dae3e928688996c62b2101"} Apr 17 18:49:37.922089 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"e50c56ac28174eb8d723f9d7d90b22084ed6b56d207071cc2d0e07a776a8469a"} Apr 17 18:49:37.922089 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.921972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerDied","Data":"dd647619c169f42812ba3fb5196dd9bdf62889566b405fac379289e34492e113"} Apr 17 18:49:37.923177 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.923150 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="ebd607fb61b443953560580997ff87d9e13398e488552aeeaa1fc84b47f8e7c8" exitCode=0 Apr 17 18:49:37.923247 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.923201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"ebd607fb61b443953560580997ff87d9e13398e488552aeeaa1fc84b47f8e7c8"} Apr 17 18:49:37.924473 ip-10-0-141-118 kubenswrapper[2574]: I0417 
Apr 17 18:49:37.925625 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.925601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" event={"ID":"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6","Type":"ContainerStarted","Data":"39085cccb90ae4586cfb620c831b09ff884fa0988a8db48b0c8adfe143ac8394"}
Apr 17 18:49:37.936512 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.935258 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-j7fsm" podStartSLOduration=4.739515811 podStartE2EDuration="21.935241843s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.451295333 +0000 UTC m=+3.224039722" lastFinishedPulling="2026-04-17 18:49:36.647021368 +0000 UTC m=+20.419765754" observedRunningTime="2026-04-17 18:49:37.934212455 +0000 UTC m=+21.706956867" watchObservedRunningTime="2026-04-17 18:49:37.935241843 +0000 UTC m=+21.707986252"
Apr 17 18:49:37.947098 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.947062 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n7gs7" podStartSLOduration=4.90133912 podStartE2EDuration="21.94705068s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.441235802 +0000 UTC m=+3.213980188" lastFinishedPulling="2026-04-17 18:49:36.486947359 +0000 UTC m=+20.259691748" observedRunningTime="2026-04-17 18:49:37.94704833 +0000 UTC m=+21.719792738" watchObservedRunningTime="2026-04-17 18:49:37.94705068 +0000 UTC m=+21.719795087"
Apr 17 18:49:37.978133 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:37.978089 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fzlng" podStartSLOduration=4.943800822 podStartE2EDuration="21.97807699s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.452654982 +0000 UTC m=+3.225399371" lastFinishedPulling="2026-04-17 18:49:36.486931153 +0000 UTC m=+20.259675539" observedRunningTime="2026-04-17 18:49:37.978056554 +0000 UTC m=+21.750800961" watchObservedRunningTime="2026-04-17 18:49:37.97807699 +0000 UTC m=+21.750821397"
Apr 17 18:49:38.432101 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.432077 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 18:49:38.734957 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.734825 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T18:49:38.432096943Z","UUID":"7ca2bf3b-9b86-46c3-a3f5-61abd434cef8","Handler":null,"Name":"","Endpoint":""}
Apr 17 18:49:38.737917 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.737893 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 18:49:38.738052 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.737927 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 18:49:38.791041 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.791016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n"
Apr 17 18:49:38.791206 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.791019 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj"
Apr 17 18:49:38.791206 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:38.791108 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf"
Apr 17 18:49:38.791206 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:38.791194 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854"
Apr 17 18:49:38.791312 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.791227 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg"
Apr 17 18:49:38.791312 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:38.791286 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854"
Apr 17 18:49:38.929507 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.929467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" event={"ID":"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6","Type":"ContainerStarted","Data":"fe34a878f48dfdd684dcd189f06be0f3f0f61aacf81e9b8130bc11e21b6977c4"}
Apr 17 18:49:38.931107 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.931068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-m7v2p" event={"ID":"ecc75342-1be7-4279-95ad-c14e71294a6e","Type":"ContainerStarted","Data":"25e00750fdeb99d6c7b605113dd93ada3b5e1333912886649bc925d55ecb10a8"}
Apr 17 18:49:38.944872 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:38.944784 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-m7v2p" podStartSLOduration=5.821240067 podStartE2EDuration="22.944746543s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.44787217 +0000 UTC m=+3.220616571" lastFinishedPulling="2026-04-17 18:49:36.571378658 +0000 UTC m=+20.344123047" observedRunningTime="2026-04-17 18:49:38.944067826 +0000 UTC m=+22.716812258" watchObservedRunningTime="2026-04-17 18:49:38.944746543 +0000 UTC m=+22.717490955"
Apr 17 18:49:39.936446 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:39.936276 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log"
Apr 17 18:49:39.936919 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:39.936880 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"8a753eca046dcda919b36abe7ca510975aa897b5242794a4cb9f3034565fcd06"}
Apr 17 18:49:39.939756 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:39.939721 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" event={"ID":"fcd0f83b-ece2-4fbc-9deb-af086f5f39d6","Type":"ContainerStarted","Data":"b237a2642a19d29cbe4b61025134b867945ec943957206d2aedf81d8c7874c35"}
Apr 17 18:49:39.956674 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:39.956628 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kbc5z" podStartSLOduration=3.7503538020000002 podStartE2EDuration="23.956616114s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.449843366 +0000 UTC m=+3.222587757" lastFinishedPulling="2026-04-17 18:49:39.656105683 +0000 UTC m=+23.428850069" observedRunningTime="2026-04-17 18:49:39.956285415 +0000 UTC m=+23.729029849" watchObservedRunningTime="2026-04-17 18:49:39.956616114 +0000 UTC m=+23.729360521"
Apr 17 18:49:40.788049 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:40.788013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj"
Apr 17 18:49:40.788268 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:40.788013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg"
Apr 17 18:49:40.788268 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:40.788156 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854"
Apr 17 18:49:40.788268 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:40.788013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n"
Apr 17 18:49:40.788268 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:40.788230 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854"
Apr 17 18:49:40.788526 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:40.788295 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf"
Apr 17 18:49:42.477547 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.477524 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-j7fsm"
Apr 17 18:49:42.478287 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.478271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-j7fsm"
Apr 17 18:49:42.788312 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.788113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n"
Apr 17 18:49:42.788448 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.788123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg"
Apr 17 18:49:42.788448 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:42.788390 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf"
Apr 17 18:49:42.788553 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:42.788498 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854"
pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:42.788553 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.788153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:42.788647 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:42.788628 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:42.947831 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.947806 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:49:42.948166 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.948141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"b0f8a954a8622c7a65fd37d7dc50e26f5bd98122233db4856c12bab09bed0f27"} Apr 17 18:49:42.948552 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.948528 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:42.948637 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.948560 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:42.948682 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.948641 2574 scope.go:117] "RemoveContainer" containerID="dd647619c169f42812ba3fb5196dd9bdf62889566b405fac379289e34492e113" Apr 17 18:49:42.950022 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.950000 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="84ca3e854e043eb3d93d741597f5beb48bd0b53c808d40d3f4aa2ac6e4aa1c2c" exitCode=0 Apr 17 18:49:42.950123 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.950027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"84ca3e854e043eb3d93d741597f5beb48bd0b53c808d40d3f4aa2ac6e4aa1c2c"} Apr 17 18:49:42.950299 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.950284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:42.950797 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.950780 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-j7fsm" Apr 17 18:49:42.964871 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:42.964852 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:43.953838 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:43.953744 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="0ba8976a5f108f5a54ef2300f322d78cf760a4b829453b57cc5bef3c9505fa16" exitCode=0 Apr 17 18:49:43.953838 ip-10-0-141-118 
kubenswrapper[2574]: I0417 18:49:43.953805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"0ba8976a5f108f5a54ef2300f322d78cf760a4b829453b57cc5bef3c9505fa16"} Apr 17 18:49:43.957456 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:43.957435 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:49:43.957818 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:43.957792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" event={"ID":"91638f07-e924-407f-bb78-79ea02748faa","Type":"ContainerStarted","Data":"3eeab85e3284950efd1819382d8d686f0621026d42a2a6aa3cb5b02470e350ce"} Apr 17 18:49:43.958242 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:43.958217 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:43.973213 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:43.973189 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:49:44.441181 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.441118 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" podStartSLOduration=11.166014333 podStartE2EDuration="28.441094952s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.445338098 +0000 UTC m=+3.218082485" lastFinishedPulling="2026-04-17 18:49:36.720418703 +0000 UTC m=+20.493163104" observedRunningTime="2026-04-17 18:49:43.999419803 +0000 UTC m=+27.772164211" watchObservedRunningTime="2026-04-17 18:49:44.441094952 +0000 UTC m=+28.213839362" Apr 17 18:49:44.442334 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.442028 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bq86n"] Apr 17 18:49:44.442334 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.442161 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:44.442334 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:44.442267 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:44.442848 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.442818 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dpqmj"] Apr 17 18:49:44.443000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.442973 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:44.443135 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:44.443093 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:44.443884 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.443861 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4wxbg"] Apr 17 18:49:44.443989 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.443969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:44.444060 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:44.444035 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:44.962098 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.961850 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="da91a8d353831a3492b7b0d26aca54a7a81a1860236c0d4b09cd3138a7d6bd1c" exitCode=0 Apr 17 18:49:44.962098 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:44.961898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"da91a8d353831a3492b7b0d26aca54a7a81a1860236c0d4b09cd3138a7d6bd1c"} Apr 17 18:49:45.788348 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:45.788311 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:45.788520 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:45.788430 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:46.789065 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:46.788974 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:46.789065 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:46.789055 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:46.789638 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:46.789113 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:46.789638 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:46.789419 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:47.788785 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:47.788738 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:47.788945 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:47.788873 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:48.788582 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:48.788501 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:48.789134 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:48.788621 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4wxbg" podUID="240834a9-2dd7-4a8c-8c31-3bcd8ec75854" Apr 17 18:49:48.789134 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:48.788665 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:48.789134 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:48.788795 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bq86n" podUID="4cde13d3-2f3b-4ae9-b90e-00369cefc3cf" Apr 17 18:49:49.787854 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:49.787817 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:49.788158 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:49.787951 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dpqmj" podUID="9e29f722-7b28-401a-9488-46ff42062854" Apr 17 18:49:50.506478 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.506391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:50.507068 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.506555 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:50.507068 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.506633 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:22.50661381 +0000 UTC m=+66.279358216 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:49:50.582837 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.582803 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-118.ec2.internal" event="NodeReady" Apr 17 18:49:50.583020 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.582956 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 18:49:50.606940 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.606842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:50.607110 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.607010 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:49:50.607110 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.607030 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:49:50.607110 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.607044 2574 projected.go:194] Error preparing data for projected volume kube-api-access-rwwpv for pod openshift-network-diagnostics/network-check-target-4wxbg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:50.607110 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.607109 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv podName:240834a9-2dd7-4a8c-8c31-3bcd8ec75854 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:22.607090545 +0000 UTC m=+66.379834945 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rwwpv" (UniqueName: "kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv") pod "network-check-target-4wxbg" (UID: "240834a9-2dd7-4a8c-8c31-3bcd8ec75854") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:49:50.622280 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.622240 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-r7tlb"] Apr 17 18:49:50.655789 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.655727 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-q6kpz"] Apr 17 18:49:50.655952 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.655830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.658382 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.658359 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 18:49:50.658535 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.658464 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 18:49:50.658535 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.658514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\"" Apr 17 18:49:50.671914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.671875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r7tlb"] Apr 17 18:49:50.671914 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.671903 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q6kpz"] Apr 17 18:49:50.672046 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.672005 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:50.674217 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.674165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\"" Apr 17 18:49:50.674217 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.674207 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 18:49:50.674444 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.674410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 18:49:50.674502 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.674452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 18:49:50.788438 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.788353 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:49:50.788438 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.788382 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:50.791264 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.791238 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:49:50.791393 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.791322 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tlcmp\"" Apr 17 18:49:50.791393 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.791352 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 18:49:50.791393 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.791386 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:49:50.808484 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:50.808584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3853ba23-4818-4e5c-adb0-a74c55faa515-config-volume\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.808584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3853ba23-4818-4e5c-adb0-a74c55faa515-tmp-dir\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.808584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwgtw\" (UniqueName: \"kubernetes.io/projected/02bba305-18ac-410d-ab1e-0abfaf32082a-kube-api-access-lwgtw\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:50.808683 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.808683 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.808630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ztbd\" (UniqueName: \"kubernetes.io/projected/3853ba23-4818-4e5c-adb0-a74c55faa515-kube-api-access-6ztbd\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.909900 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.909864 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3853ba23-4818-4e5c-adb0-a74c55faa515-config-volume\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.909900 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.909905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3853ba23-4818-4e5c-adb0-a74c55faa515-tmp-dir\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.910108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.909943 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwgtw\" (UniqueName: \"kubernetes.io/projected/02bba305-18ac-410d-ab1e-0abfaf32082a-kube-api-access-lwgtw\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:50.910108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.909976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.910108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.910002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ztbd\" (UniqueName: \"kubernetes.io/projected/3853ba23-4818-4e5c-adb0-a74c55faa515-kube-api-access-6ztbd\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.910108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.910031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:50.910313 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.910116 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:50.910313 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.910128 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:50.910313 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.910188 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:51.410168534 +0000 UTC m=+35.182912935 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:49:50.910313 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:50.910207 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:49:51.410198722 +0000 UTC m=+35.182943109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:49:50.910313 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.910291 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3853ba23-4818-4e5c-adb0-a74c55faa515-tmp-dir\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.910542 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.910441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3853ba23-4818-4e5c-adb0-a74c55faa515-config-volume\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.920924 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.920900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ztbd\" (UniqueName: \"kubernetes.io/projected/3853ba23-4818-4e5c-adb0-a74c55faa515-kube-api-access-6ztbd\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:50.921129 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:50.921110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwgtw\" (UniqueName: \"kubernetes.io/projected/02bba305-18ac-410d-ab1e-0abfaf32082a-kube-api-access-lwgtw\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:51.413068 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.413039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:51.413185 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.413082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:51.413185 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:51.413180 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:51.413298 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:51.413191 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:51.413298 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:51.413224 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:49:52.413210016 +0000 UTC m=+36.185954402 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:49:51.413298 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:51.413252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:52.413232545 +0000 UTC m=+36.185976944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:49:51.788287 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.788194 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:49:51.790809 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.790785 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:49:51.790877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.790860 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\"" Apr 17 18:49:51.978988 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.978957 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="6a6e30c046ca7dc3619c8966d32e8309274f1de1cad914249ec49f30401458a8" exitCode=0 Apr 17 18:49:51.979145 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:51.978996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"6a6e30c046ca7dc3619c8966d32e8309274f1de1cad914249ec49f30401458a8"} Apr 17 18:49:52.421926 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:52.421835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:52.421926 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:52.421921 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:52.422146 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:52.422007 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 17 18:49:52.422146 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:52.422004 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:52.422146 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:52.422057 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.422043395 +0000 UTC m=+38.194787785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:49:52.422146 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:52.422078 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:49:54.422072563 +0000 UTC m=+38.194816949 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:49:52.984305 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:52.984106 2574 generic.go:358] "Generic (PLEG): container finished" podID="b985cdb5-8694-44dd-aa5f-2770ef10e3c4" containerID="c3b781bbdb78f7670086898591944e58a02f8f54d2c7382b459b0ccde9984f11" exitCode=0 Apr 17 18:49:52.984674 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:52.984189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerDied","Data":"c3b781bbdb78f7670086898591944e58a02f8f54d2c7382b459b0ccde9984f11"} Apr 17 18:49:53.227576 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.227543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:53.230829 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.230807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4cde13d3-2f3b-4ae9-b90e-00369cefc3cf-original-pull-secret\") pod \"global-pull-secret-syncer-bq86n\" (UID: \"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf\") " pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:53.505202 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.505169 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bq86n" Apr 17 18:49:53.679404 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.679372 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bq86n"] Apr 17 18:49:53.683896 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:49:53.683862 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cde13d3_2f3b_4ae9_b90e_00369cefc3cf.slice/crio-02daeecdb1557fbcaa46db0409a88db31cbba6c5525ab1cfb7cdb794330a6d36 WatchSource:0}: Error finding container 02daeecdb1557fbcaa46db0409a88db31cbba6c5525ab1cfb7cdb794330a6d36: Status 404 returned error can't find the container with id 02daeecdb1557fbcaa46db0409a88db31cbba6c5525ab1cfb7cdb794330a6d36 Apr 17 18:49:53.989178 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.988783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" event={"ID":"b985cdb5-8694-44dd-aa5f-2770ef10e3c4","Type":"ContainerStarted","Data":"0f3d656811f0c83cc63a6a99d44dbc59fc388cf35e7c6f4c6249159c7330b086"} Apr 17 18:49:53.989811 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:53.989791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bq86n" event={"ID":"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf","Type":"ContainerStarted","Data":"02daeecdb1557fbcaa46db0409a88db31cbba6c5525ab1cfb7cdb794330a6d36"} Apr 17 18:49:54.008625 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:54.008574 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4xm2n" podStartSLOduration=6.211823879 podStartE2EDuration="38.00856172s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:49:19.443991667 +0000 UTC m=+3.216736055" lastFinishedPulling="2026-04-17 18:49:51.240729505 +0000 UTC m=+35.013473896" observedRunningTime="2026-04-17 18:49:54.007382878 +0000 UTC m=+37.780127286" watchObservedRunningTime="2026-04-17 18:49:54.00856172 +0000 UTC m=+37.781306127" Apr 17 18:49:54.436584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:54.436551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:54.436782 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:54.436634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:54.436782 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:54.436741 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:54.436876 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:54.436870 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:49:58.436850612 +0000 UTC m=+42.209594998 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:49:54.437265 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:54.437248 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:54.437337 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:54.437292 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:49:58.437280825 +0000 UTC m=+42.210025211 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:49:58.466672 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:58.466631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:49:58.467136 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:58.466700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:49:58.467136 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:58.466816 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:49:58.467136 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:58.466856 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:49:58.467136 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:58.466909 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:06.466884909 +0000 UTC m=+50.239629319 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:49:58.467136 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:49:58.466931 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:50:06.466922056 +0000 UTC m=+50.239666449 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:49:59.001673 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:49:59.001634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bq86n" event={"ID":"4cde13d3-2f3b-4ae9-b90e-00369cefc3cf","Type":"ContainerStarted","Data":"c27083fe07507d0b3e8388208b041bc0de3fb98491e78452c8a877f0351657fe"} Apr 17 18:50:06.069757 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.069700 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bq86n" podStartSLOduration=39.932183344 podStartE2EDuration="45.069680347s" podCreationTimestamp="2026-04-17 18:49:21 +0000 UTC" firstStartedPulling="2026-04-17 18:49:53.685415147 +0000 UTC m=+37.458159533" lastFinishedPulling="2026-04-17 18:49:58.82291215 +0000 UTC m=+42.595656536" observedRunningTime="2026-04-17 18:49:59.015075654 +0000 UTC m=+42.787820064" watchObservedRunningTime="2026-04-17 18:50:06.069680347 +0000 UTC m=+49.842424761" Apr 17 18:50:06.070213 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.069864 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4"] Apr 17 18:50:06.106531 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.106503 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4"] Apr 17 18:50:06.106697 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.106633 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.109649 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 18:50:06.109649 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109643 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 18:50:06.109859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109610 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 18:50:06.109859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109673 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 18:50:06.109859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109695 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 18:50:06.109859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.109731 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 18:50:06.110519 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.110498 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 18:50:06.226667 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/40f722db-717a-402c-bd77-e411dbe58fde-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.226667 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.226904 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.226904 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: 
\"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.226904 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.226904 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.226840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn9nh\" (UniqueName: \"kubernetes.io/projected/40f722db-717a-402c-bd77-e411dbe58fde-kube-api-access-gn9nh\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.327826 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.327739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gn9nh\" (UniqueName: \"kubernetes.io/projected/40f722db-717a-402c-bd77-e411dbe58fde-kube-api-access-gn9nh\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.327957 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.327861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/40f722db-717a-402c-bd77-e411dbe58fde-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.327957 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.327891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.327957 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.327931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.328111 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.328089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.328157 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.328132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.328739 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.328715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/40f722db-717a-402c-bd77-e411dbe58fde-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.331703 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.331676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-ca\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.331826 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.331683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.331826 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.331722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.331826 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.331735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/40f722db-717a-402c-bd77-e411dbe58fde-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.335408 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.335388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn9nh\" (UniqueName: \"kubernetes.io/projected/40f722db-717a-402c-bd77-e411dbe58fde-kube-api-access-gn9nh\") pod \"cluster-proxy-proxy-agent-5dc76d74db-7w9v4\" (UID: \"40f722db-717a-402c-bd77-e411dbe58fde\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.429682 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.429640 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:50:06.529380 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.529350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:50:06.529530 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.529423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:06.529530 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:06.529506 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:50:06.529603 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:06.529572 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:50:22.529555884 +0000 UTC m=+66.302300274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:50:06.529603 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:06.529509 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:50:06.529673 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:06.529651 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. No retries permitted until 2026-04-17 18:50:22.529638036 +0000 UTC m=+66.302382441 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:50:06.544237 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:06.544210 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4"] Apr 17 18:50:06.548242 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:50:06.548217 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40f722db_717a_402c_bd77_e411dbe58fde.slice/crio-63bf95e97b6fc53df0a1b88d55c4a46b943cecba9685801b2561e3e85f6aa6bf WatchSource:0}: Error finding container 63bf95e97b6fc53df0a1b88d55c4a46b943cecba9685801b2561e3e85f6aa6bf: Status 404 returned error can't find the container with id 63bf95e97b6fc53df0a1b88d55c4a46b943cecba9685801b2561e3e85f6aa6bf Apr 17 18:50:07.017987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:07.017955 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerStarted","Data":"63bf95e97b6fc53df0a1b88d55c4a46b943cecba9685801b2561e3e85f6aa6bf"} Apr 17 18:50:15.974890 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:15.974863 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rj69g" Apr 17 18:50:22.534851 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.534806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.534881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.534904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.534974 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.535003 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.535055 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls podName:3853ba23-4818-4e5c-adb0-a74c55faa515 nodeName:}" failed. 
No retries permitted until 2026-04-17 18:50:54.535034452 +0000 UTC m=+98.307778859 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls") pod "dns-default-r7tlb" (UID: "3853ba23-4818-4e5c-adb0-a74c55faa515") : secret "dns-default-metrics-tls" not found Apr 17 18:50:22.535416 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.535075 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert podName:02bba305-18ac-410d-ab1e-0abfaf32082a nodeName:}" failed. No retries permitted until 2026-04-17 18:50:54.535065667 +0000 UTC m=+98.307810060 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert") pod "ingress-canary-q6kpz" (UID: "02bba305-18ac-410d-ab1e-0abfaf32082a") : secret "canary-serving-cert" not found Apr 17 18:50:22.537572 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.537547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:50:22.545731 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.545708 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:50:22.545842 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:50:22.545759 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs podName:9e29f722-7b28-401a-9488-46ff42062854 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:26.545744884 +0000 UTC m=+130.318489270 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs") pod "network-metrics-daemon-dpqmj" (UID: "9e29f722-7b28-401a-9488-46ff42062854") : secret "metrics-daemon-secret" not found Apr 17 18:50:22.635820 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.635783 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:50:22.639422 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.639405 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:50:22.648844 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.648825 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:50:22.659369 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.659343 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwpv\" (UniqueName: \"kubernetes.io/projected/240834a9-2dd7-4a8c-8c31-3bcd8ec75854-kube-api-access-rwwpv\") pod \"network-check-target-4wxbg\" (UID: \"240834a9-2dd7-4a8c-8c31-3bcd8ec75854\") " pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:50:22.902224 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.902194 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tlcmp\"" Apr 17 18:50:22.910907 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:22.910885 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:50:23.024424 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:23.024393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4wxbg"] Apr 17 18:50:23.028286 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:50:23.028258 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod240834a9_2dd7_4a8c_8c31_3bcd8ec75854.slice/crio-df17bcca165597b93d444f509e61925fd856adad20e81e32c607e3498e185e55 WatchSource:0}: Error finding container df17bcca165597b93d444f509e61925fd856adad20e81e32c607e3498e185e55: Status 404 returned error can't find the container with id df17bcca165597b93d444f509e61925fd856adad20e81e32c607e3498e185e55 Apr 17 18:50:23.051527 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:23.051499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4wxbg" event={"ID":"240834a9-2dd7-4a8c-8c31-3bcd8ec75854","Type":"ContainerStarted","Data":"df17bcca165597b93d444f509e61925fd856adad20e81e32c607e3498e185e55"} Apr 17 18:50:28.064760 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:28.064707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerStarted","Data":"4ebf4b8c3f86c8f5ab07e490700f00b635bc3397f0b1b2f02eca159b0930dabb"} Apr 17 18:50:28.066335 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:28.066298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4wxbg" event={"ID":"240834a9-2dd7-4a8c-8c31-3bcd8ec75854","Type":"ContainerStarted","Data":"a7eb5c9c346a80c0ea705e621585e5b311546b0f02a141ac879c677c0c49ab5f"} Apr 17 18:50:28.066478 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:28.066453 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:50:28.080285 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:28.080229 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4wxbg" podStartSLOduration=67.813897358 podStartE2EDuration="1m12.08021198s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:50:23.030189199 +0000 UTC m=+66.802933585" lastFinishedPulling="2026-04-17 18:50:27.29650382 +0000 UTC m=+71.069248207" observedRunningTime="2026-04-17 18:50:28.080171286 +0000 UTC m=+71.852915695" watchObservedRunningTime="2026-04-17 18:50:28.08021198 +0000 UTC m=+71.852956390" Apr 17 18:50:30.072255 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:30.072220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerStarted","Data":"a17578275649a09753115f8978011607b3270507e8db5de2616dd828ecb3f086"} Apr 17 18:50:30.072255 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:30.072258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerStarted","Data":"3f2e8a675a0c3fbd4aac5264ee8b0c4792d8def051f5bdf3ba48f31b6f4fa164"} 
Apr 17 18:50:30.090377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:30.090333 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" podStartSLOduration=0.947722397 podStartE2EDuration="24.090321538s" podCreationTimestamp="2026-04-17 18:50:06 +0000 UTC" firstStartedPulling="2026-04-17 18:50:06.550309472 +0000 UTC m=+50.323053858" lastFinishedPulling="2026-04-17 18:50:29.692908613 +0000 UTC m=+73.465652999" observedRunningTime="2026-04-17 18:50:30.088795032 +0000 UTC m=+73.861539443" watchObservedRunningTime="2026-04-17 18:50:30.090321538 +0000 UTC m=+73.863065945" Apr 17 18:50:37.409372 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.409237 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8"] Apr 17 18:50:37.412265 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.412244 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.414488 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.414454 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 18:50:37.414605 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.414545 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 18:50:37.414605 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.414556 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:50:37.414605 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.414550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-kvk7j\"" Apr 17 18:50:37.415416 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.415402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 18:50:37.421258 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.421235 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8"] Apr 17 18:50:37.446018 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.445990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.446145 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.446060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rsx\" (UniqueName: \"kubernetes.io/projected/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-kube-api-access-r4rsx\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: 
\"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.446145 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.446098 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.546795 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.546736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.546977 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.546844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rsx\" (UniqueName: \"kubernetes.io/projected/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-kube-api-access-r4rsx\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.546977 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.546882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.547974 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.547952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.549147 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.549128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.555026 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.555004 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rsx\" (UniqueName: \"kubernetes.io/projected/73d6f00a-e262-4e05-a576-fa1aa63bd8a8-kube-api-access-r4rsx\") pod \"kube-storage-version-migrator-operator-6769c5d45-n5kw8\" (UID: \"73d6f00a-e262-4e05-a576-fa1aa63bd8a8\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.721362 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.721277 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" Apr 17 18:50:37.834699 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:37.834666 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8"] Apr 17 18:50:37.837660 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:50:37.837636 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d6f00a_e262_4e05_a576_fa1aa63bd8a8.slice/crio-377c9fb3fadd9b1cf58a5aee273ae52b84508be486c57c6755b3add5e1617528 WatchSource:0}: Error finding container 377c9fb3fadd9b1cf58a5aee273ae52b84508be486c57c6755b3add5e1617528: Status 404 returned error can't find the container with id 377c9fb3fadd9b1cf58a5aee273ae52b84508be486c57c6755b3add5e1617528 Apr 17 18:50:38.088544 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:38.088513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" event={"ID":"73d6f00a-e262-4e05-a576-fa1aa63bd8a8","Type":"ContainerStarted","Data":"377c9fb3fadd9b1cf58a5aee273ae52b84508be486c57c6755b3add5e1617528"} Apr 17 18:50:40.094390 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:40.094345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" event={"ID":"73d6f00a-e262-4e05-a576-fa1aa63bd8a8","Type":"ContainerStarted","Data":"0c45b44172218b4b53b5efb7fd8017ab31b908c208c5536c484d4188e70a5c38"} Apr 17 18:50:40.108216 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:40.108165 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" podStartSLOduration=1.047843553 podStartE2EDuration="3.108148376s" podCreationTimestamp="2026-04-17 18:50:37 +0000 UTC" firstStartedPulling="2026-04-17 18:50:37.839639602 +0000 UTC m=+81.612383989" lastFinishedPulling="2026-04-17 18:50:39.899944422 +0000 UTC m=+83.672688812" observedRunningTime="2026-04-17 18:50:40.107474143 +0000 UTC m=+83.880218595" watchObservedRunningTime="2026-04-17 18:50:40.108148376 +0000 UTC m=+83.880892786" Apr 17 18:50:43.006318 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:43.006289 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fzlng_19201734-1263-46e9-b401-4768c56c505c/dns-node-resolver/0.log" Apr 17 18:50:44.006097 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:44.006067 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7gs7_075172fd-6f0b-45b8-8765-5b6397bdb2b8/node-ca/0.log" Apr 17 18:50:54.568995 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.568961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 
18:50:54.569396 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.569019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:54.571460 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.571436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3853ba23-4818-4e5c-adb0-a74c55faa515-metrics-tls\") pod \"dns-default-r7tlb\" (UID: \"3853ba23-4818-4e5c-adb0-a74c55faa515\") " pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:54.571568 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.571516 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02bba305-18ac-410d-ab1e-0abfaf32082a-cert\") pod \"ingress-canary-q6kpz\" (UID: \"02bba305-18ac-410d-ab1e-0abfaf32082a\") " pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:50:54.583498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.583471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ppj8p\"" Apr 17 18:50:54.592407 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.592389 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-q6kpz" Apr 17 18:50:54.710128 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.710098 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-q6kpz"] Apr 17 18:50:54.713418 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:50:54.713395 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02bba305_18ac_410d_ab1e_0abfaf32082a.slice/crio-4ef072bddc028db54017097d6ae846001c3ad5c2f39e67d614e10e91dcbd3204 WatchSource:0}: Error finding container 4ef072bddc028db54017097d6ae846001c3ad5c2f39e67d614e10e91dcbd3204: Status 404 returned error can't find the container with id 4ef072bddc028db54017097d6ae846001c3ad5c2f39e67d614e10e91dcbd3204 Apr 17 18:50:54.869470 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.869395 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctvd8\"" Apr 17 18:50:54.878167 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.878146 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:54.988031 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:54.988004 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-r7tlb"] Apr 17 18:50:54.991253 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:50:54.991229 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3853ba23_4818_4e5c_adb0_a74c55faa515.slice/crio-645b0dd03259464bdac8a4f94a566e0ca38775bc0b6aa51ca50af9edf7c9186b WatchSource:0}: Error finding container 645b0dd03259464bdac8a4f94a566e0ca38775bc0b6aa51ca50af9edf7c9186b: Status 404 returned error can't find the container with id 645b0dd03259464bdac8a4f94a566e0ca38775bc0b6aa51ca50af9edf7c9186b Apr 17 18:50:55.131735 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:55.131652 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r7tlb" event={"ID":"3853ba23-4818-4e5c-adb0-a74c55faa515","Type":"ContainerStarted","Data":"645b0dd03259464bdac8a4f94a566e0ca38775bc0b6aa51ca50af9edf7c9186b"} Apr 17 18:50:55.132679 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:55.132651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q6kpz" event={"ID":"02bba305-18ac-410d-ab1e-0abfaf32082a","Type":"ContainerStarted","Data":"4ef072bddc028db54017097d6ae846001c3ad5c2f39e67d614e10e91dcbd3204"} Apr 17 18:50:58.143968 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.143929 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r7tlb" event={"ID":"3853ba23-4818-4e5c-adb0-a74c55faa515","Type":"ContainerStarted","Data":"7ba17f85539d4a9a240a9e4888724e0fd2081387dac5632987fae0945bdb7e3d"} Apr 17 18:50:58.143968 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.143971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-r7tlb" event={"ID":"3853ba23-4818-4e5c-adb0-a74c55faa515","Type":"ContainerStarted","Data":"b329340cc8e4682f80cebe82cc4d5aee09364b4fdb067d670288046cc25478a1"} Apr 17 18:50:58.144477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.144035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-r7tlb" Apr 17 18:50:58.145224 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.145199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-q6kpz" event={"ID":"02bba305-18ac-410d-ab1e-0abfaf32082a","Type":"ContainerStarted","Data":"b6b5c88ef505c0a54e39fd124526e14302a0adc9dc0ed09b394ebcd6b4b53272"} Apr 17 18:50:58.161133 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.161061 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-r7tlb" podStartSLOduration=65.827045565 podStartE2EDuration="1m8.161049672s" podCreationTimestamp="2026-04-17 18:49:50 +0000 UTC" firstStartedPulling="2026-04-17 18:50:54.993015483 +0000 UTC m=+98.765759870" lastFinishedPulling="2026-04-17 18:50:57.327019587 +0000 UTC m=+101.099763977" observedRunningTime="2026-04-17 18:50:58.160311228 +0000 UTC m=+101.933055635" watchObservedRunningTime="2026-04-17 18:50:58.161049672 +0000 UTC m=+101.933794080" Apr 17 18:50:58.173814 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:58.173750 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-q6kpz" podStartSLOduration=65.56594762 
podStartE2EDuration="1m8.173739421s" podCreationTimestamp="2026-04-17 18:49:50 +0000 UTC" firstStartedPulling="2026-04-17 18:50:54.71564724 +0000 UTC m=+98.488391625" lastFinishedPulling="2026-04-17 18:50:57.323439036 +0000 UTC m=+101.096183426" observedRunningTime="2026-04-17 18:50:58.17359706 +0000 UTC m=+101.946341468" watchObservedRunningTime="2026-04-17 18:50:58.173739421 +0000 UTC m=+101.946483828" Apr 17 18:50:59.071032 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:50:59.070993 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4wxbg" Apr 17 18:51:08.109524 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.109446 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f445ccfd9-bm6zt"] Apr 17 18:51:08.111478 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.111456 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.114470 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.114446 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 18:51:08.115053 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.115025 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 18:51:08.115133 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.115030 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-2m8h2\"" Apr 17 18:51:08.115423 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.115408 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 18:51:08.119238 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.119218 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-66qdh"] Apr 17 18:51:08.121262 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.121240 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 18:51:08.121355 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.121327 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.123436 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.123415 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-njpjx\"" Apr 17 18:51:08.123539 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.123447 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 18:51:08.123803 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.123785 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 18:51:08.123907 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.123890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 18:51:08.123976 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.123942 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 18:51:08.126653 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.126630 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f445ccfd9-bm6zt"] Apr 17 18:51:08.134346 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.134326 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-66qdh"] Apr 17 18:51:08.150319 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.150297 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-r7tlb" Apr 17 18:51:08.272653 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks29f\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-kube-api-access-ks29f\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.272653 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8fdc99b-b5cd-41d2-b3ec-c73459772491-data-volume\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.272891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8fdc99b-b5cd-41d2-b3ec-c73459772491-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.272891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " 
pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.272891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272872 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-bound-sa-token\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8fdc99b-b5cd-41d2-b3ec-c73459772491-crio-socket\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.273000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-tls\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273061 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.272996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/638e9ce4-6adf-4245-a9d2-47e2df3045e6-ca-trust-extracted\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273061 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.273022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-certificates\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273061 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.273040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-installation-pull-secrets\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273169 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.273081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjdht\" (UniqueName: \"kubernetes.io/projected/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-api-access-wjdht\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.273220 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.273203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-image-registry-private-configuration\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.273269 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.273233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-trusted-ca\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374129 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374054 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374129 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-bound-sa-token\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374129 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8fdc99b-b5cd-41d2-b3ec-c73459772491-crio-socket\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374129 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-tls\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/638e9ce4-6adf-4245-a9d2-47e2df3045e6-ca-trust-extracted\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-certificates\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374182 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-installation-pull-secrets\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjdht\" (UniqueName: \"kubernetes.io/projected/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-api-access-wjdht\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b8fdc99b-b5cd-41d2-b3ec-c73459772491-crio-socket\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-image-registry-private-configuration\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-trusted-ca\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374292 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks29f\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-kube-api-access-ks29f\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8fdc99b-b5cd-41d2-b3ec-c73459772491-data-volume\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8fdc99b-b5cd-41d2-b3ec-c73459772491-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.374965 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/638e9ce4-6adf-4245-a9d2-47e2df3045e6-ca-trust-extracted\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.374965 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.375072 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.374835 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b8fdc99b-b5cd-41d2-b3ec-c73459772491-data-volume\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.375416 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.375393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-trusted-ca\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.375713 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.375692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-certificates\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.376847 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.376827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-registry-tls\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.377344 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.377325 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-installation-pull-secrets\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.377388 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.377342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b8fdc99b-b5cd-41d2-b3ec-c73459772491-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.377519 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.377497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/638e9ce4-6adf-4245-a9d2-47e2df3045e6-image-registry-private-configuration\") pod 
\"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.381346 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.381323 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-bound-sa-token\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.381708 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.381689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjdht\" (UniqueName: \"kubernetes.io/projected/b8fdc99b-b5cd-41d2-b3ec-c73459772491-kube-api-access-wjdht\") pod \"insights-runtime-extractor-66qdh\" (UID: \"b8fdc99b-b5cd-41d2-b3ec-c73459772491\") " pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.381812 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.381756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks29f\" (UniqueName: \"kubernetes.io/projected/638e9ce4-6adf-4245-a9d2-47e2df3045e6-kube-api-access-ks29f\") pod \"image-registry-f445ccfd9-bm6zt\" (UID: \"638e9ce4-6adf-4245-a9d2-47e2df3045e6\") " pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.420843 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.420818 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:08.430482 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.430453 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-66qdh" Apr 17 18:51:08.554000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.553970 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f445ccfd9-bm6zt"] Apr 17 18:51:08.557254 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:08.557215 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod638e9ce4_6adf_4245_a9d2_47e2df3045e6.slice/crio-d458311ed156d57c8d283349de0cec6344b5372b1dcf332d8479ca321dab3996 WatchSource:0}: Error finding container d458311ed156d57c8d283349de0cec6344b5372b1dcf332d8479ca321dab3996: Status 404 returned error can't find the container with id d458311ed156d57c8d283349de0cec6344b5372b1dcf332d8479ca321dab3996 Apr 17 18:51:08.567810 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:08.567742 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-66qdh"] Apr 17 18:51:08.570176 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:08.570155 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8fdc99b_b5cd_41d2_b3ec_c73459772491.slice/crio-79b0c0014c03b6e20ca0b9e329334542680069a2d19c11943712aab75885fa2c WatchSource:0}: Error finding container 79b0c0014c03b6e20ca0b9e329334542680069a2d19c11943712aab75885fa2c: Status 404 returned error can't find the container with id 79b0c0014c03b6e20ca0b9e329334542680069a2d19c11943712aab75885fa2c Apr 17 18:51:09.180082 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.180048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-66qdh" event={"ID":"b8fdc99b-b5cd-41d2-b3ec-c73459772491","Type":"ContainerStarted","Data":"3e0109c1f5d46ceb5793e06848d7933d9dfe4efafe097fd49ed9ab98187a404f"} Apr 17 18:51:09.180534 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.180091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-66qdh" event={"ID":"b8fdc99b-b5cd-41d2-b3ec-c73459772491","Type":"ContainerStarted","Data":"79b0c0014c03b6e20ca0b9e329334542680069a2d19c11943712aab75885fa2c"} Apr 17 18:51:09.181491 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.181463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" event={"ID":"638e9ce4-6adf-4245-a9d2-47e2df3045e6","Type":"ContainerStarted","Data":"c538c56f86db6184ff43e27ed4f00c3ac3dde7cc4be6c06175f9e3deec2365c2"} Apr 17 18:51:09.181599 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.181494 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" event={"ID":"638e9ce4-6adf-4245-a9d2-47e2df3045e6","Type":"ContainerStarted","Data":"d458311ed156d57c8d283349de0cec6344b5372b1dcf332d8479ca321dab3996"} Apr 17 18:51:09.181663 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.181647 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:09.199741 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:09.199701 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" podStartSLOduration=1.199688565 podStartE2EDuration="1.199688565s" podCreationTimestamp="2026-04-17 18:51:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:51:09.198748521 +0000 UTC m=+112.971492930" watchObservedRunningTime="2026-04-17 18:51:09.199688565 +0000 UTC m=+112.972432972" Apr 17 18:51:10.188539 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:10.188502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-66qdh" event={"ID":"b8fdc99b-b5cd-41d2-b3ec-c73459772491","Type":"ContainerStarted","Data":"dbbcb64508cf5ca0f8b4e745d1beb0168346d3d7ed012fd51e1213b61870c1ba"} Apr 17 18:51:12.195616 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:12.195533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-66qdh" event={"ID":"b8fdc99b-b5cd-41d2-b3ec-c73459772491","Type":"ContainerStarted","Data":"51b0edabeebe912c0e179e8a09b587a45dd32e51cc5ed5f8f900cb1e5650f55b"} Apr 17 18:51:12.215321 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:12.215272 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-66qdh" podStartSLOduration=0.935848423 podStartE2EDuration="4.215259711s" podCreationTimestamp="2026-04-17 18:51:08 +0000 UTC" firstStartedPulling="2026-04-17 18:51:08.613626405 +0000 UTC m=+112.386370791" lastFinishedPulling="2026-04-17 18:51:11.893037688 +0000 UTC m=+115.665782079" observedRunningTime="2026-04-17 18:51:12.214559383 +0000 UTC m=+115.987303788" watchObservedRunningTime="2026-04-17 18:51:12.215259711 +0000 UTC m=+115.988004119" Apr 17 18:51:15.581502 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.581458 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hld44"] Apr 17 18:51:15.583937 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.583920 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.587548 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587523 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 17 18:51:15.587676 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 17 18:51:15.587676 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 18:51:15.587676 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-m8gpg\"" Apr 17 18:51:15.587676 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587560 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 18:51:15.587922 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.587558 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 18:51:15.591103 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.591082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hld44"] Apr 17 18:51:15.732683 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.732642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89r9q\" (UniqueName: \"kubernetes.io/projected/2261af19-5997-4854-9384-97c64c2d7dc4-kube-api-access-89r9q\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.732865 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.732783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.732865 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.732820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2261af19-5997-4854-9384-97c64c2d7dc4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.732865 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.732844 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.833937 ip-10-0-141-118 
kubenswrapper[2574]: I0417 18:51:15.833854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.833937 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.833893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2261af19-5997-4854-9384-97c64c2d7dc4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.833937 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.833912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.834205 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.833957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89r9q\" (UniqueName: \"kubernetes.io/projected/2261af19-5997-4854-9384-97c64c2d7dc4-kube-api-access-89r9q\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.834617 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.834596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2261af19-5997-4854-9384-97c64c2d7dc4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.836417 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.836395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.836504 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.836395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/2261af19-5997-4854-9384-97c64c2d7dc4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.841960 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.841939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89r9q\" (UniqueName: \"kubernetes.io/projected/2261af19-5997-4854-9384-97c64c2d7dc4-kube-api-access-89r9q\") pod \"prometheus-operator-5676c8c784-hld44\" (UID: \"2261af19-5997-4854-9384-97c64c2d7dc4\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:15.893904 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:15.893860 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" Apr 17 18:51:16.008330 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:16.008302 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hld44"] Apr 17 18:51:16.011336 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:16.011309 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2261af19_5997_4854_9384_97c64c2d7dc4.slice/crio-5b1fa2118250faa9a10ab2a090f57090bd25fd35cd47377c8b1d10c7652401e3 WatchSource:0}: Error finding container 5b1fa2118250faa9a10ab2a090f57090bd25fd35cd47377c8b1d10c7652401e3: Status 404 returned error can't find the container with id 5b1fa2118250faa9a10ab2a090f57090bd25fd35cd47377c8b1d10c7652401e3 Apr 17 18:51:16.208319 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:16.208237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" event={"ID":"2261af19-5997-4854-9384-97c64c2d7dc4","Type":"ContainerStarted","Data":"5b1fa2118250faa9a10ab2a090f57090bd25fd35cd47377c8b1d10c7652401e3"} Apr 17 18:51:18.214648 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:18.214605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" event={"ID":"2261af19-5997-4854-9384-97c64c2d7dc4","Type":"ContainerStarted","Data":"187c96e97c6e11be5e4dc46aca379de76742fe75b0ac80550f82809c22843fba"} Apr 17 18:51:18.214648 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:18.214648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" event={"ID":"2261af19-5997-4854-9384-97c64c2d7dc4","Type":"ContainerStarted","Data":"7dcb47cd81681c1044eccaaccde6fd37a2c512ef6b4393663ed34089e43341a2"} Apr 17 18:51:18.230477 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:18.230408 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-hld44" podStartSLOduration=2.002831642 podStartE2EDuration="3.230395369s" podCreationTimestamp="2026-04-17 18:51:15 +0000 UTC" firstStartedPulling="2026-04-17 18:51:16.013134091 +0000 UTC m=+119.785878476" lastFinishedPulling="2026-04-17 18:51:17.240697816 +0000 UTC m=+121.013442203" observedRunningTime="2026-04-17 18:51:18.229330378 +0000 UTC m=+122.002074786" watchObservedRunningTime="2026-04-17 18:51:18.230395369 +0000 UTC m=+122.003139777" Apr 17 18:51:19.966872 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.966840 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zlk7p"] Apr 17 18:51:19.968984 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.968968 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:19.971236 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.971211 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 18:51:19.971377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.971257 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7jr9c\"" Apr 17 18:51:19.971377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.971218 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 18:51:19.971377 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:19.971308 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 18:51:20.064987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.064947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-wtmp\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.064987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.064993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065195 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065195 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-root\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065195 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdgc6\" (UniqueName: \"kubernetes.io/projected/744be0a5-8d64-4004-ab9b-a120080a13b5-kube-api-access-cdgc6\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065195 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-metrics-client-ca\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " 
pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065315 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065315 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-sys\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.065315 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.065255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-textfile\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166231 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-root\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166374 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdgc6\" (UniqueName: \"kubernetes.io/projected/744be0a5-8d64-4004-ab9b-a120080a13b5-kube-api-access-cdgc6\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166374 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-root\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166447 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-metrics-client-ca\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166486 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166486 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-sys\") pod 
\"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166576 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-textfile\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166576 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-wtmp\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166669 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:51:20.166592 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 18:51:20.166669 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166602 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166669 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-sys\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166669 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:51:20.166655 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls podName:744be0a5-8d64-4004-ab9b-a120080a13b5 nodeName:}" failed. No retries permitted until 2026-04-17 18:51:20.66663349 +0000 UTC m=+124.439377893 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls") pod "node-exporter-zlk7p" (UID: "744be0a5-8d64-4004-ab9b-a120080a13b5") : secret "node-exporter-tls" not found Apr 17 18:51:20.166894 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.166894 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-wtmp\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.167000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-textfile\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.167000 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.166958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-metrics-client-ca\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.167271 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.167245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-accelerators-collector-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.169351 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.169327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.175965 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.175942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdgc6\" (UniqueName: \"kubernetes.io/projected/744be0a5-8d64-4004-ab9b-a120080a13b5-kube-api-access-cdgc6\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.670760 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.670723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls\") pod \"node-exporter-zlk7p\" (UID: 
\"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.673134 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.673101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/744be0a5-8d64-4004-ab9b-a120080a13b5-node-exporter-tls\") pod \"node-exporter-zlk7p\" (UID: \"744be0a5-8d64-4004-ab9b-a120080a13b5\") " pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.882043 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:20.882013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zlk7p" Apr 17 18:51:20.891431 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:20.891396 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744be0a5_8d64_4004_ab9b_a120080a13b5.slice/crio-288ceaab80d52d7c2d570daef9b70a0d2a26539b7d76afb0a406c8f0f1253237 WatchSource:0}: Error finding container 288ceaab80d52d7c2d570daef9b70a0d2a26539b7d76afb0a406c8f0f1253237: Status 404 returned error can't find the container with id 288ceaab80d52d7c2d570daef9b70a0d2a26539b7d76afb0a406c8f0f1253237 Apr 17 18:51:21.223104 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:21.223066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlk7p" event={"ID":"744be0a5-8d64-4004-ab9b-a120080a13b5","Type":"ContainerStarted","Data":"288ceaab80d52d7c2d570daef9b70a0d2a26539b7d76afb0a406c8f0f1253237"} Apr 17 18:51:22.227383 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:22.227348 2574 generic.go:358] "Generic (PLEG): container finished" podID="744be0a5-8d64-4004-ab9b-a120080a13b5" containerID="66d10b7673a943b9c57550d9e1d3474fbbd7fb3025b3a7abaa6efd9c8c2ab63e" exitCode=0 Apr 17 18:51:22.227740 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:22.227429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlk7p" event={"ID":"744be0a5-8d64-4004-ab9b-a120080a13b5","Type":"ContainerDied","Data":"66d10b7673a943b9c57550d9e1d3474fbbd7fb3025b3a7abaa6efd9c8c2ab63e"} Apr 17 18:51:23.231290 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:23.231250 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlk7p" event={"ID":"744be0a5-8d64-4004-ab9b-a120080a13b5","Type":"ContainerStarted","Data":"0f86917c05aff3c589e022f3412e7df7d3fe2a419710b94fea0e4f43c90fb9bc"} Apr 17 18:51:23.231290 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:23.231294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zlk7p" event={"ID":"744be0a5-8d64-4004-ab9b-a120080a13b5","Type":"ContainerStarted","Data":"f1d14ed51f429fadc798b0d0cb638fb5fba8b4e8098bb0c67e0211728546c9d4"} Apr 17 18:51:23.249090 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:23.249043 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zlk7p" podStartSLOduration=3.439467987 podStartE2EDuration="4.249029232s" podCreationTimestamp="2026-04-17 18:51:19 +0000 UTC" firstStartedPulling="2026-04-17 18:51:20.893361524 +0000 UTC m=+124.666105910" lastFinishedPulling="2026-04-17 18:51:21.702922768 +0000 UTC m=+125.475667155" observedRunningTime="2026-04-17 18:51:23.248408199 +0000 UTC m=+127.021152607" watchObservedRunningTime="2026-04-17 18:51:23.249029232 +0000 UTC m=+127.021773677" Apr 17 18:51:24.291998 ip-10-0-141-118 
Apr 17 18:51:24.291998 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.291961 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"]
Apr 17 18:51:24.294249 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.294234 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"
Apr 17 18:51:24.296603 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.296583 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 18:51:24.297519 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.297490 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-rsvhs\""
Apr 17 18:51:24.297519 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.297506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-2671rkl9m4jt8\""
Apr 17 18:51:24.297519 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.297515 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 18:51:24.297704 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.297497 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 18:51:24.297704 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.297500 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 18:51:24.306866 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.306845 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"]
Apr 17 18:51:24.399654 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.399615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/92661bb5-d795-4fae-b174-a3151e1c7420-audit-log\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"
Apr 17 18:51:24.399654 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.399650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-metrics-server-audit-profiles\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"
Apr 17 18:51:24.399903 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.399878 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-client-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"
Apr 17 18:51:24.400031 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.400003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.400070 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.400049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwkl\" (UniqueName: \"kubernetes.io/projected/92661bb5-d795-4fae-b174-a3151e1c7420-kube-api-access-fwwkl\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.400139 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.400122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-client-certs\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.400175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.400156 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-tls\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501119 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/92661bb5-d795-4fae-b174-a3151e1c7420-audit-log\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501119 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-metrics-server-audit-profiles\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-client-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-fwwkl\" (UniqueName: \"kubernetes.io/projected/92661bb5-d795-4fae-b174-a3151e1c7420-kube-api-access-fwwkl\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-client-certs\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501513 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-tls\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501564 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/92661bb5-d795-4fae-b174-a3151e1c7420-audit-log\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.501977 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.501952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.502438 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.502411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/92661bb5-d795-4fae-b174-a3151e1c7420-metrics-server-audit-profiles\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.503971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.503947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-client-certs\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.504046 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.504016 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-client-ca-bundle\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.504046 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.504031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/92661bb5-d795-4fae-b174-a3151e1c7420-secret-metrics-server-tls\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.508469 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.508446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwkl\" (UniqueName: \"kubernetes.io/projected/92661bb5-d795-4fae-b174-a3151e1c7420-kube-api-access-fwwkl\") pod \"metrics-server-7dc5f5d68-qvmvl\" (UID: \"92661bb5-d795-4fae-b174-a3151e1c7420\") " pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.602778 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.602743 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:24.731981 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:24.731950 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7dc5f5d68-qvmvl"] Apr 17 18:51:24.735147 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:24.735114 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92661bb5_d795_4fae_b174_a3151e1c7420.slice/crio-d7c7a12933ddf0aa7ac261091dd7fee9fb484254f2b3a9e3b37ea5bef6fd2ab0 WatchSource:0}: Error finding container d7c7a12933ddf0aa7ac261091dd7fee9fb484254f2b3a9e3b37ea5bef6fd2ab0: Status 404 returned error can't find the container with id d7c7a12933ddf0aa7ac261091dd7fee9fb484254f2b3a9e3b37ea5bef6fd2ab0 Apr 17 18:51:25.237534 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:25.237501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" event={"ID":"92661bb5-d795-4fae-b174-a3151e1c7420","Type":"ContainerStarted","Data":"d7c7a12933ddf0aa7ac261091dd7fee9fb484254f2b3a9e3b37ea5bef6fd2ab0"} Apr 17 18:51:26.617123 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:26.617087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:51:26.619697 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:26.619672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e29f722-7b28-401a-9488-46ff42062854-metrics-certs\") pod \"network-metrics-daemon-dpqmj\" (UID: \"9e29f722-7b28-401a-9488-46ff42062854\") " pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:51:26.899327 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:26.899248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dwb49\"" Apr 17 18:51:26.908024 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:26.908006 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dpqmj" Apr 17 18:51:27.026351 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:27.026321 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dpqmj"] Apr 17 18:51:27.029487 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:27.029462 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e29f722_7b28_401a_9488_46ff42062854.slice/crio-7c200db69e98a07aadb8ee75c7c949f85587740b5aad03090158615174628f02 WatchSource:0}: Error finding container 7c200db69e98a07aadb8ee75c7c949f85587740b5aad03090158615174628f02: Status 404 returned error can't find the container with id 7c200db69e98a07aadb8ee75c7c949f85587740b5aad03090158615174628f02 Apr 17 18:51:27.248539 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:27.248455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" event={"ID":"92661bb5-d795-4fae-b174-a3151e1c7420","Type":"ContainerStarted","Data":"c2e57f8c042c6d132d1befdbae1d7afc9d560acbf801968e46613be1e8936c36"} Apr 17 18:51:27.249496 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:27.249474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dpqmj" event={"ID":"9e29f722-7b28-401a-9488-46ff42062854","Type":"ContainerStarted","Data":"7c200db69e98a07aadb8ee75c7c949f85587740b5aad03090158615174628f02"} Apr 17 18:51:27.267546 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:27.267509 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" podStartSLOduration=1.718625962 podStartE2EDuration="3.267490846s" podCreationTimestamp="2026-04-17 18:51:24 +0000 UTC" firstStartedPulling="2026-04-17 18:51:24.737324444 +0000 UTC m=+128.510068830" lastFinishedPulling="2026-04-17 18:51:26.286189329 +0000 UTC m=+130.058933714" observedRunningTime="2026-04-17 18:51:27.266378108 +0000 UTC m=+131.039122518" watchObservedRunningTime="2026-04-17 18:51:27.267490846 +0000 UTC m=+131.040235254" Apr 17 18:51:28.253134 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:28.253101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dpqmj" event={"ID":"9e29f722-7b28-401a-9488-46ff42062854","Type":"ContainerStarted","Data":"ffa6edd3ff4f0e6f1042728758e2c857798bd8a3c51994e2cea8b64870369e3d"} Apr 17 18:51:28.253134 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:28.253134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dpqmj" event={"ID":"9e29f722-7b28-401a-9488-46ff42062854","Type":"ContainerStarted","Data":"26851c637b96e6f62654e0117692f5785e35b655b14346e752380c0eddcccefe"} Apr 17 18:51:28.271315 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:28.271268 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dpqmj" podStartSLOduration=131.278324042 podStartE2EDuration="2m12.271251624s" podCreationTimestamp="2026-04-17 18:49:16 +0000 UTC" firstStartedPulling="2026-04-17 18:51:27.031255143 +0000 UTC m=+130.803999530" lastFinishedPulling="2026-04-17 18:51:28.024182726 +0000 UTC m=+131.796927112" observedRunningTime="2026-04-17 18:51:28.269824542 +0000 UTC m=+132.042568950" watchObservedRunningTime="2026-04-17 18:51:28.271251624 +0000 UTC m=+132.043996031" Apr 17 18:51:30.193436 ip-10-0-141-118 
kubenswrapper[2574]: I0417 18:51:30.193149 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f445ccfd9-bm6zt" Apr 17 18:51:33.364332 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.364299 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-qwbn9"] Apr 17 18:51:33.366733 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.366715 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:33.368964 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.368939 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 18:51:33.369057 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.368991 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 18:51:33.369057 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.368999 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-kctrx\"" Apr 17 18:51:33.374312 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.374289 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qwbn9"] Apr 17 18:51:33.461839 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.461796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvt7j\" (UniqueName: \"kubernetes.io/projected/585034e5-3c8d-42a7-9bd7-7f42418e3163-kube-api-access-dvt7j\") pod \"downloads-6bcc868b7-qwbn9\" (UID: \"585034e5-3c8d-42a7-9bd7-7f42418e3163\") " pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:33.562998 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.562964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvt7j\" (UniqueName: \"kubernetes.io/projected/585034e5-3c8d-42a7-9bd7-7f42418e3163-kube-api-access-dvt7j\") pod \"downloads-6bcc868b7-qwbn9\" (UID: \"585034e5-3c8d-42a7-9bd7-7f42418e3163\") " pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:33.570948 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.570919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvt7j\" (UniqueName: \"kubernetes.io/projected/585034e5-3c8d-42a7-9bd7-7f42418e3163-kube-api-access-dvt7j\") pod \"downloads-6bcc868b7-qwbn9\" (UID: \"585034e5-3c8d-42a7-9bd7-7f42418e3163\") " pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:33.676450 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.676364 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:33.792462 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:33.792331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-qwbn9"] Apr 17 18:51:33.794977 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:51:33.794949 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod585034e5_3c8d_42a7_9bd7_7f42418e3163.slice/crio-caaefcd9a0571f68d525c0a7ea2b4aefbd73270cafff69bffa70e3f5f6d4d5e3 WatchSource:0}: Error finding container caaefcd9a0571f68d525c0a7ea2b4aefbd73270cafff69bffa70e3f5f6d4d5e3: Status 404 returned error can't find the container with id caaefcd9a0571f68d525c0a7ea2b4aefbd73270cafff69bffa70e3f5f6d4d5e3 Apr 17 18:51:34.268909 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:34.268868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qwbn9" event={"ID":"585034e5-3c8d-42a7-9bd7-7f42418e3163","Type":"ContainerStarted","Data":"caaefcd9a0571f68d525c0a7ea2b4aefbd73270cafff69bffa70e3f5f6d4d5e3"} Apr 17 18:51:44.603887 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:44.603844 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:44.604374 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:44.603905 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:51:53.326252 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.326214 2574 generic.go:358] "Generic (PLEG): container finished" podID="73d6f00a-e262-4e05-a576-fa1aa63bd8a8" containerID="0c45b44172218b4b53b5efb7fd8017ab31b908c208c5536c484d4188e70a5c38" exitCode=0 Apr 17 18:51:53.326669 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.326290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" event={"ID":"73d6f00a-e262-4e05-a576-fa1aa63bd8a8","Type":"ContainerDied","Data":"0c45b44172218b4b53b5efb7fd8017ab31b908c208c5536c484d4188e70a5c38"} Apr 17 18:51:53.326710 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.326674 2574 scope.go:117] "RemoveContainer" containerID="0c45b44172218b4b53b5efb7fd8017ab31b908c208c5536c484d4188e70a5c38" Apr 17 18:51:53.327825 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.327799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-qwbn9" event={"ID":"585034e5-3c8d-42a7-9bd7-7f42418e3163","Type":"ContainerStarted","Data":"d01d4a31d1fa731ed059a2eb65f5f8f24941c8de4557692c762046ebc3828548"} Apr 17 18:51:53.328042 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.328020 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:53.338215 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.338194 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-qwbn9" Apr 17 18:51:53.360505 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:53.360437 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-qwbn9" podStartSLOduration=1.091741144 podStartE2EDuration="20.36041983s" podCreationTimestamp="2026-04-17 18:51:33 +0000 UTC" 
firstStartedPulling="2026-04-17 18:51:33.796930953 +0000 UTC m=+137.569675339" lastFinishedPulling="2026-04-17 18:51:53.065609633 +0000 UTC m=+156.838354025" observedRunningTime="2026-04-17 18:51:53.359450631 +0000 UTC m=+157.132195040" watchObservedRunningTime="2026-04-17 18:51:53.36041983 +0000 UTC m=+157.133164239" Apr 17 18:51:54.333220 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:54.333182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-n5kw8" event={"ID":"73d6f00a-e262-4e05-a576-fa1aa63bd8a8","Type":"ContainerStarted","Data":"e0962ea0d885e62184468b4c660f35c2d99f3aa431cb08d3988b15e72d6e4328"} Apr 17 18:51:56.431258 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:51:56.431195 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" podUID="40f722db-717a-402c-bd77-e411dbe58fde" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:52:04.608923 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:04.608891 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:52:04.612731 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:04.612708 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7dc5f5d68-qvmvl" Apr 17 18:52:06.431485 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:06.431401 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" podUID="40f722db-717a-402c-bd77-e411dbe58fde" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:52:16.430596 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:16.430538 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" podUID="40f722db-717a-402c-bd77-e411dbe58fde" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:52:16.431017 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:16.430633 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" Apr 17 18:52:16.431138 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:16.431108 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a17578275649a09753115f8978011607b3270507e8db5de2616dd828ecb3f086"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 18:52:16.431183 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:16.431169 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" podUID="40f722db-717a-402c-bd77-e411dbe58fde" containerName="service-proxy" containerID="cri-o://a17578275649a09753115f8978011607b3270507e8db5de2616dd828ecb3f086" gracePeriod=30 Apr 17 18:52:17.409242 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:17.409205 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="40f722db-717a-402c-bd77-e411dbe58fde" containerID="a17578275649a09753115f8978011607b3270507e8db5de2616dd828ecb3f086" exitCode=2 Apr 17 18:52:17.409409 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:17.409276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerDied","Data":"a17578275649a09753115f8978011607b3270507e8db5de2616dd828ecb3f086"} Apr 17 18:52:17.409409 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:52:17.409316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5dc76d74db-7w9v4" event={"ID":"40f722db-717a-402c-bd77-e411dbe58fde","Type":"ContainerStarted","Data":"066cf5a03a3cdf20aa94641613c130be9ba301c2137bfc1194a15b91b8414aa2"} Apr 17 18:54:16.662941 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:54:16.662908 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:54:16.663480 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:54:16.663286 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:54:16.668575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:54:16.668541 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 18:55:31.673050 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.673011 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-gm25f"] Apr 17 18:55:31.676028 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.676011 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.678127 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.678104 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 18:55:31.678878 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.678854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 18:55:31.678998 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.678857 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-h94k2\"" Apr 17 18:55:31.683375 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.683352 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-gm25f"] Apr 17 18:55:31.772579 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.772548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr8lh\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-kube-api-access-xr8lh\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.772746 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.772587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.873686 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.873650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.873875 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.873746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr8lh\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-kube-api-access-xr8lh\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.886381 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.886356 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.886634 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.886616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr8lh\" (UniqueName: \"kubernetes.io/projected/27295698-9dbd-43b1-8c59-7cbc863acf7c-kube-api-access-xr8lh\") pod \"cert-manager-cainjector-8966b78d4-gm25f\" (UID: \"27295698-9dbd-43b1-8c59-7cbc863acf7c\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:31.985258 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:31.985171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" Apr 17 18:55:32.101573 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:32.101539 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-gm25f"] Apr 17 18:55:32.104865 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:55:32.104837 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27295698_9dbd_43b1_8c59_7cbc863acf7c.slice/crio-ca68f8dd8906822f992be6663f47f5704a23ec391bcf1d3df3a0ffac54ae9b3c WatchSource:0}: Error finding container ca68f8dd8906822f992be6663f47f5704a23ec391bcf1d3df3a0ffac54ae9b3c: Status 404 returned error can't find the container with id ca68f8dd8906822f992be6663f47f5704a23ec391bcf1d3df3a0ffac54ae9b3c Apr 17 18:55:32.106746 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:32.106732 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:55:32.912061 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:32.912021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" event={"ID":"27295698-9dbd-43b1-8c59-7cbc863acf7c","Type":"ContainerStarted","Data":"ca68f8dd8906822f992be6663f47f5704a23ec391bcf1d3df3a0ffac54ae9b3c"} Apr 17 18:55:35.921180 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:35.921142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" event={"ID":"27295698-9dbd-43b1-8c59-7cbc863acf7c","Type":"ContainerStarted","Data":"012596a043cb996d67d473a935ccb6ea806a4d286da8426a4c02d9bf32d8e98e"} Apr 17 18:55:35.944909 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:35.944855 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-gm25f" podStartSLOduration=1.6847166900000001 podStartE2EDuration="4.94483968s" podCreationTimestamp="2026-04-17 18:55:31 +0000 UTC" firstStartedPulling="2026-04-17 18:55:32.106874164 +0000 UTC m=+375.879618550" lastFinishedPulling="2026-04-17 18:55:35.366997144 +0000 UTC m=+379.139741540" observedRunningTime="2026-04-17 18:55:35.943144628 +0000 UTC m=+379.715889037" watchObservedRunningTime="2026-04-17 18:55:35.94483968 +0000 UTC m=+379.717584088" Apr 17 18:55:47.884936 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.884900 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-wd6dc"] Apr 17 18:55:47.886864 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.886841 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:47.888971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.888949 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-gtkms\"" Apr 17 18:55:47.896353 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.896331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wd6dc"] Apr 17 18:55:47.996108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.996067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-bound-sa-token\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:47.996108 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:47.996115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnqh\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-kube-api-access-nxnqh\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.097271 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.097234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnqh\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-kube-api-access-nxnqh\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.097427 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.097312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-bound-sa-token\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.104845 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.104815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-bound-sa-token\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.104979 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.104885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnqh\" (UniqueName: \"kubernetes.io/projected/b837cdde-5aa8-4d3b-bb46-c9573e061451-kube-api-access-nxnqh\") pod \"cert-manager-759f64656b-wd6dc\" (UID: \"b837cdde-5aa8-4d3b-bb46-c9573e061451\") " pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.196040 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.195962 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wd6dc" Apr 17 18:55:48.311802 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.311756 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wd6dc"] Apr 17 18:55:48.314536 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:55:48.314504 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb837cdde_5aa8_4d3b_bb46_c9573e061451.slice/crio-3a629d213bfa45b7613c2969843f19c37739d511a2f2207d3365e4edeaee0594 WatchSource:0}: Error finding container 3a629d213bfa45b7613c2969843f19c37739d511a2f2207d3365e4edeaee0594: Status 404 returned error can't find the container with id 3a629d213bfa45b7613c2969843f19c37739d511a2f2207d3365e4edeaee0594 Apr 17 18:55:48.959891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.959856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wd6dc" event={"ID":"b837cdde-5aa8-4d3b-bb46-c9573e061451","Type":"ContainerStarted","Data":"915929bba80ba9abc2671145c533bb96778574be2e582ba853ed8601ccb05bbb"} Apr 17 18:55:48.959891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.959891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wd6dc" event={"ID":"b837cdde-5aa8-4d3b-bb46-c9573e061451","Type":"ContainerStarted","Data":"3a629d213bfa45b7613c2969843f19c37739d511a2f2207d3365e4edeaee0594"} Apr 17 18:55:48.974181 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:55:48.974135 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-wd6dc" podStartSLOduration=1.974120825 podStartE2EDuration="1.974120825s" podCreationTimestamp="2026-04-17 18:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:55:48.97260818 +0000 UTC m=+392.745352589" watchObservedRunningTime="2026-04-17 18:55:48.974120825 +0000 UTC m=+392.746865233" Apr 17 18:56:01.556415 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.556381 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg"] Apr 17 18:56:01.558504 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.558483 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.560692 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.560664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 18:56:01.561134 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.561115 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-w5cl6\"" Apr 17 18:56:01.561220 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.561183 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 18:56:01.561418 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.561398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 18:56:01.561505 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.561406 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 18:56:01.575692 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.575669 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg"] Apr 17 18:56:01.693022 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.692983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbt4g\" (UniqueName: \"kubernetes.io/projected/21834787-d3c0-4a75-b176-8640876eb579-kube-api-access-kbt4g\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.693249 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.693035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.693249 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.693128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.793654 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.793622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbt4g\" (UniqueName: \"kubernetes.io/projected/21834787-d3c0-4a75-b176-8640876eb579-kube-api-access-kbt4g\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.793856 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.793660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.793856 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.793815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.796293 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.796269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.796415 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.796393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/21834787-d3c0-4a75-b176-8640876eb579-webhook-cert\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.802178 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.802153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbt4g\" (UniqueName: \"kubernetes.io/projected/21834787-d3c0-4a75-b176-8640876eb579-kube-api-access-kbt4g\") pod \"opendatahub-operator-controller-manager-6fc6488c9d-jdmcg\" (UID: \"21834787-d3c0-4a75-b176-8640876eb579\") " pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.869019 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.868948 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:01.991294 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:01.991264 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg"] Apr 17 18:56:01.995646 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:56:01.995618 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21834787_d3c0_4a75_b176_8640876eb579.slice/crio-4d65b6a024211195fe4323648c61ac1cbb335eeaf10be0ba1e2d1e16275710d0 WatchSource:0}: Error finding container 4d65b6a024211195fe4323648c61ac1cbb335eeaf10be0ba1e2d1e16275710d0: Status 404 returned error can't find the container with id 4d65b6a024211195fe4323648c61ac1cbb335eeaf10be0ba1e2d1e16275710d0 Apr 17 18:56:02.998928 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:02.998886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" event={"ID":"21834787-d3c0-4a75-b176-8640876eb579","Type":"ContainerStarted","Data":"4d65b6a024211195fe4323648c61ac1cbb335eeaf10be0ba1e2d1e16275710d0"} Apr 17 18:56:05.005536 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:05.005496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" event={"ID":"21834787-d3c0-4a75-b176-8640876eb579","Type":"ContainerStarted","Data":"a2d2423e70e8964537185584a9bdbd51cfc87b9efbca6349b23158616016da80"} Apr 17 18:56:05.005931 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:05.005599 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:05.025893 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:05.025843 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" podStartSLOduration=1.633957372 podStartE2EDuration="4.025828727s" podCreationTimestamp="2026-04-17 18:56:01 +0000 UTC" firstStartedPulling="2026-04-17 18:56:01.997368332 +0000 UTC m=+405.770112721" lastFinishedPulling="2026-04-17 18:56:04.389239687 +0000 UTC m=+408.161984076" observedRunningTime="2026-04-17 18:56:05.023983724 +0000 UTC m=+408.796728133" watchObservedRunningTime="2026-04-17 18:56:05.025828727 +0000 UTC m=+408.798573136" Apr 17 18:56:16.011261 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:16.011235 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6fc6488c9d-jdmcg" Apr 17 18:56:19.650621 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.650582 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x"] Apr 17 18:56:19.654987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.654967 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.657258 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.657235 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 18:56:19.658079 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.658048 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 18:56:19.658079 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.658054 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 18:56:19.658229 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.658097 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lr4dd\"" Apr 17 18:56:19.658229 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.658055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 18:56:19.660969 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.660947 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x"] Apr 17 18:56:19.829649 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.829613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tls-certs\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.829852 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.829654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tmp\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.829852 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.829691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4hdj\" (UniqueName: \"kubernetes.io/projected/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-kube-api-access-r4hdj\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.930978 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.930895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tmp\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.930978 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.930939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4hdj\" (UniqueName: \"kubernetes.io/projected/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-kube-api-access-r4hdj\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.931201 ip-10-0-141-118 kubenswrapper[2574]: I0417 
18:56:19.931003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tls-certs\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.933285 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.933259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tmp\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.933503 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.933487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-tls-certs\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.938798 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.938757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4hdj\" (UniqueName: \"kubernetes.io/projected/a4c43ec7-2e95-4a42-97b3-88721e3ac0ab-kube-api-access-r4hdj\") pod \"kube-auth-proxy-59447f86f4-6dz8x\" (UID: \"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab\") " pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:19.964754 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:19.964730 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" Apr 17 18:56:20.086584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:20.086555 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x"] Apr 17 18:56:20.091963 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:56:20.091928 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c43ec7_2e95_4a42_97b3_88721e3ac0ab.slice/crio-2436de917c2dd89975b43ef2ad029099a0071094167f88b0006fa249bfc1e96f WatchSource:0}: Error finding container 2436de917c2dd89975b43ef2ad029099a0071094167f88b0006fa249bfc1e96f: Status 404 returned error can't find the container with id 2436de917c2dd89975b43ef2ad029099a0071094167f88b0006fa249bfc1e96f Apr 17 18:56:21.049987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:21.049947 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" event={"ID":"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab","Type":"ContainerStarted","Data":"2436de917c2dd89975b43ef2ad029099a0071094167f88b0006fa249bfc1e96f"} Apr 17 18:56:23.213402 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.213354 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-x8dsg"] Apr 17 18:56:23.215746 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.215720 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.217977 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.217948 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 18:56:23.218078 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.218050 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-j9t5g\"" Apr 17 18:56:23.224428 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.224406 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-x8dsg"] Apr 17 18:56:23.360612 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.360571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmnj\" (UniqueName: \"kubernetes.io/projected/2b98dbea-d54e-49c7-9b26-6e037e8fe470-kube-api-access-vwmnj\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.360817 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.360631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.461270 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.461229 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.461453 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.461342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmnj\" (UniqueName: \"kubernetes.io/projected/2b98dbea-d54e-49c7-9b26-6e037e8fe470-kube-api-access-vwmnj\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.461453 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:23.461412 2574 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 18:56:23.461560 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:23.461489 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert podName:2b98dbea-d54e-49c7-9b26-6e037e8fe470 nodeName:}" failed. No retries permitted until 2026-04-17 18:56:23.961470594 +0000 UTC m=+427.734214982 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert") pod "odh-model-controller-858dbf95b8-x8dsg" (UID: "2b98dbea-d54e-49c7-9b26-6e037e8fe470") : secret "odh-model-controller-webhook-cert" not found Apr 17 18:56:23.472554 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.472494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmnj\" (UniqueName: \"kubernetes.io/projected/2b98dbea-d54e-49c7-9b26-6e037e8fe470-kube-api-access-vwmnj\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.965115 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.965081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:23.967521 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:23.967500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b98dbea-d54e-49c7-9b26-6e037e8fe470-cert\") pod \"odh-model-controller-858dbf95b8-x8dsg\" (UID: \"2b98dbea-d54e-49c7-9b26-6e037e8fe470\") " pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:24.058509 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:24.058467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" event={"ID":"a4c43ec7-2e95-4a42-97b3-88721e3ac0ab","Type":"ContainerStarted","Data":"57b04a0a664fa5f1117688aaab21ab4fd5a980269aa2c931808fc41d31a2d8a2"} Apr 17 18:56:24.073555 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:24.073462 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-59447f86f4-6dz8x" podStartSLOduration=1.34959487 podStartE2EDuration="5.073447246s" podCreationTimestamp="2026-04-17 18:56:19 +0000 UTC" firstStartedPulling="2026-04-17 18:56:20.094566762 +0000 UTC m=+423.867311162" lastFinishedPulling="2026-04-17 18:56:23.81841915 +0000 UTC m=+427.591163538" observedRunningTime="2026-04-17 18:56:24.071985855 +0000 UTC m=+427.844730299" watchObservedRunningTime="2026-04-17 18:56:24.073447246 +0000 UTC m=+427.846191719"
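The failure above is an ordering race rather than a crash: the pod references a Secret (odh-model-controller-webhook-cert) that its operator has not created yet, so MountVolume.SetUp for the "cert" volume fails and the volume manager parks the operation behind an exponentially growing delay (the logged "durationBeforeRetry 500ms") while the pod's other volumes mount normally. The retry at 18:56:23.965 succeeds as soon as the Secret exists. A minimal sketch of that retry shape, assuming a 500ms initial delay that doubles per failure (the cap is an assumption for illustration, not a value taken from this log):

```go
// Sketch of the mount-retry pattern above: a failed volume SetUp is parked
// behind an exponentially growing delay instead of being retried hot.
// Names and the 2m cap are illustrative assumptions, not kubelet source.
package main

import (
	"errors"
	"fmt"
	"time"
)

// setUpSecretVolume stands in for MountVolume.SetUp: it fails while the
// referenced Secret does not yet exist in the API server.
func setUpSecretVolume(secretExists func() bool) error {
	if !secretExists() {
		return errors.New(`secret "odh-model-controller-webhook-cert" not found`)
	}
	return nil
}

func main() {
	delay := 500 * time.Millisecond  // matches "durationBeforeRetry 500ms"
	const maxDelay = 2 * time.Minute // assumed cap, for illustration only

	// Pretend the Secret shows up shortly after the first failed attempt,
	// as the webhook cert did in the log.
	readyAt := time.Now().Add(400 * time.Millisecond)
	secretExists := func() bool { return time.Now().After(readyAt) }

	for {
		if err := setUpSecretVolume(secretExists); err != nil {
			fmt.Printf("SetUp failed: %v; no retries permitted for %v\n", err, delay)
			time.Sleep(delay)
			delay *= 2
			if delay > maxDelay {
				delay = maxDelay
			}
			continue
		}
		fmt.Println("MountVolume.SetUp succeeded")
		return
	}
}
```

Note that the pod itself is never rejected: the projected token volume mounts fine on the first pass, and the pod proceeds as soon as the last volume is in place.

Apr 17 18:56:24.129079 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:24.129041 2574 util.go:30] "No sandbox for pod can be found.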
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:24.258498 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:24.253364 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-x8dsg"] Apr 17 18:56:24.258498 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:56:24.255528 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b98dbea_d54e_49c7_9b26_6e037e8fe470.slice/crio-14fcea74daa4de68fe68cc3d234b6c95e6fe9214590fdab68026f54043cca9f3 WatchSource:0}: Error finding container 14fcea74daa4de68fe68cc3d234b6c95e6fe9214590fdab68026f54043cca9f3: Status 404 returned error can't find the container with id 14fcea74daa4de68fe68cc3d234b6c95e6fe9214590fdab68026f54043cca9f3 Apr 17 18:56:25.063818 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:25.063779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" event={"ID":"2b98dbea-d54e-49c7-9b26-6e037e8fe470","Type":"ContainerStarted","Data":"14fcea74daa4de68fe68cc3d234b6c95e6fe9214590fdab68026f54043cca9f3"} Apr 17 18:56:28.075855 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:28.075816 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b98dbea-d54e-49c7-9b26-6e037e8fe470" containerID="2c97699731c0ea2e9e40e69047304b734d48e2faecbc67321e455ce2fed92ddd" exitCode=1 Apr 17 18:56:28.076253 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:28.075905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" event={"ID":"2b98dbea-d54e-49c7-9b26-6e037e8fe470","Type":"ContainerDied","Data":"2c97699731c0ea2e9e40e69047304b734d48e2faecbc67321e455ce2fed92ddd"} Apr 17 18:56:28.076253 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:28.076110 2574 scope.go:117] "RemoveContainer" containerID="2c97699731c0ea2e9e40e69047304b734d48e2faecbc67321e455ce2fed92ddd" Apr 17 18:56:29.080697 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.080657 2574 generic.go:358] "Generic (PLEG): container finished" podID="2b98dbea-d54e-49c7-9b26-6e037e8fe470" containerID="0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f" exitCode=1 Apr 17 18:56:29.081211 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.080735 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" event={"ID":"2b98dbea-d54e-49c7-9b26-6e037e8fe470","Type":"ContainerDied","Data":"0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f"} Apr 17 18:56:29.081211 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.080804 2574 scope.go:117] "RemoveContainer" containerID="2c97699731c0ea2e9e40e69047304b734d48e2faecbc67321e455ce2fed92ddd" Apr 17 18:56:29.081211 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.081084 2574 scope.go:117] "RemoveContainer" containerID="0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f" Apr 17 18:56:29.081358 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:29.081311 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-x8dsg_opendatahub(2b98dbea-d54e-49c7-9b26-6e037e8fe470)\"" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" podUID="2b98dbea-d54e-49c7-9b26-6e037e8fe470" Apr 17 18:56:29.557808 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.557782 2574 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kbp4l"] Apr 17 18:56:29.560193 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.560178 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:29.562688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.562652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-xvpms\"" Apr 17 18:56:29.562885 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.562661 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 18:56:29.571065 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.571042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kbp4l"] Apr 17 18:56:29.711985 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.711939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:29.712166 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.712059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5877\" (UniqueName: \"kubernetes.io/projected/ff0265b3-8593-4d8b-9da4-d3f26de60afc-kube-api-access-f5877\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:29.812541 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.812451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:29.812541 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.812508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5877\" (UniqueName: \"kubernetes.io/projected/ff0265b3-8593-4d8b-9da4-d3f26de60afc-kube-api-access-f5877\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:29.812750 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:29.812601 2574 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 18:56:29.812750 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:29.812672 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert podName:ff0265b3-8593-4d8b-9da4-d3f26de60afc nodeName:}" failed. No retries permitted until 2026-04-17 18:56:30.312655504 +0000 UTC m=+434.085399890 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert") pod "kserve-controller-manager-856948b99f-kbp4l" (UID: "ff0265b3-8593-4d8b-9da4-d3f26de60afc") : secret "kserve-webhook-server-cert" not found Apr 17 18:56:29.821318 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:29.821287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5877\" (UniqueName: \"kubernetes.io/projected/ff0265b3-8593-4d8b-9da4-d3f26de60afc-kube-api-access-f5877\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:30.085527 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:30.085455 2574 scope.go:117] "RemoveContainer" containerID="0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f" Apr 17 18:56:30.085903 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:30.085638 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-x8dsg_opendatahub(2b98dbea-d54e-49c7-9b26-6e037e8fe470)\"" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" podUID="2b98dbea-d54e-49c7-9b26-6e037e8fe470" Apr 17 18:56:30.317512 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:30.317470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:30.319987 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:30.319962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff0265b3-8593-4d8b-9da4-d3f26de60afc-cert\") pod \"kserve-controller-manager-856948b99f-kbp4l\" (UID: \"ff0265b3-8593-4d8b-9da4-d3f26de60afc\") " pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:30.471974 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:30.471876 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:30.600113 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:30.600083 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-kbp4l"] Apr 17 18:56:30.602714 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:56:30.602688 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0265b3_8593_4d8b_9da4_d3f26de60afc.slice/crio-24a967c71a47fcb55187deb3017b2c3aff69127a976b7cbf8cc5f261be60bdb9 WatchSource:0}: Error finding container 24a967c71a47fcb55187deb3017b2c3aff69127a976b7cbf8cc5f261be60bdb9: Status 404 returned error can't find the container with id 24a967c71a47fcb55187deb3017b2c3aff69127a976b7cbf8cc5f261be60bdb9 Apr 17 18:56:31.089834 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:31.089801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" event={"ID":"ff0265b3-8593-4d8b-9da4-d3f26de60afc","Type":"ContainerStarted","Data":"24a967c71a47fcb55187deb3017b2c3aff69127a976b7cbf8cc5f261be60bdb9"} Apr 17 18:56:34.100784 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:34.100731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" event={"ID":"ff0265b3-8593-4d8b-9da4-d3f26de60afc","Type":"ContainerStarted","Data":"f1847af7fbbeb257f709f8cf03a7fdba8e661ee59989395b564a8cfd62cd53b7"} Apr 17 18:56:34.101295 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:34.100900 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:56:34.117584 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:34.117533 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" podStartSLOduration=2.4833006 podStartE2EDuration="5.117516729s" podCreationTimestamp="2026-04-17 18:56:29 +0000 UTC" firstStartedPulling="2026-04-17 18:56:30.604438491 +0000 UTC m=+434.377182877" lastFinishedPulling="2026-04-17 18:56:33.238654611 +0000 UTC m=+437.011399006" observedRunningTime="2026-04-17 18:56:34.115907316 +0000 UTC m=+437.888651727" watchObservedRunningTime="2026-04-17 18:56:34.117516729 +0000 UTC m=+437.890261138" Apr 17 18:56:34.129931 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:34.129905 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:34.130255 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:34.130242 2574 scope.go:117] "RemoveContainer" containerID="0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f" Apr 17 18:56:34.130430 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:56:34.130410 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-x8dsg_opendatahub(2b98dbea-d54e-49c7-9b26-6e037e8fe470)\"" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" podUID="2b98dbea-d54e-49c7-9b26-6e037e8fe470" Apr 17 18:56:37.743970 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.743892 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd"] Apr 17 18:56:37.746972 ip-10-0-141-118 kubenswrapper[2574]: 
I0417 18:56:37.746956 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.749230 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.749208 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 18:56:37.749381 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.749363 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 18:56:37.749722 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.749707 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-8s9vs\"" Apr 17 18:56:37.759300 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.759279 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd"] Apr 17 18:56:37.774947 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.774922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/bb70bffd-9eec-416b-b544-c2bc44c678dd-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.775054 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.775021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmfmc\" (UniqueName: \"kubernetes.io/projected/bb70bffd-9eec-416b-b544-c2bc44c678dd-kube-api-access-rmfmc\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.876194 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.876140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/bb70bffd-9eec-416b-b544-c2bc44c678dd-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.876358 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.876238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmfmc\" (UniqueName: \"kubernetes.io/projected/bb70bffd-9eec-416b-b544-c2bc44c678dd-kube-api-access-rmfmc\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.878743 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.878715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/bb70bffd-9eec-416b-b544-c2bc44c678dd-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:37.885749 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:37.885723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmfmc\" (UniqueName: 
\"kubernetes.io/projected/bb70bffd-9eec-416b-b544-c2bc44c678dd-kube-api-access-rmfmc\") pod \"servicemesh-operator3-55f49c5f94-g7zqd\" (UID: \"bb70bffd-9eec-416b-b544-c2bc44c678dd\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:38.056175 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:38.056080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:38.189538 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:38.189506 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd"] Apr 17 18:56:38.192710 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:56:38.192682 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb70bffd_9eec_416b_b544_c2bc44c678dd.slice/crio-89ed040c90dc7febc5cc6c5408f36885db40348fcf4144b560726277c8b6a668 WatchSource:0}: Error finding container 89ed040c90dc7febc5cc6c5408f36885db40348fcf4144b560726277c8b6a668: Status 404 returned error can't find the container with id 89ed040c90dc7febc5cc6c5408f36885db40348fcf4144b560726277c8b6a668 Apr 17 18:56:39.118209 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:39.118167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" event={"ID":"bb70bffd-9eec-416b-b544-c2bc44c678dd","Type":"ContainerStarted","Data":"89ed040c90dc7febc5cc6c5408f36885db40348fcf4144b560726277c8b6a668"} Apr 17 18:56:41.126971 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:41.126928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" event={"ID":"bb70bffd-9eec-416b-b544-c2bc44c678dd","Type":"ContainerStarted","Data":"8aa73d8035c3c527e710ec8329f46ff3597f0e2dc0e4f5872b1f29c79fc18339"} Apr 17 18:56:41.127364 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:41.127044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:41.148817 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:41.148737 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" podStartSLOduration=1.726693891 podStartE2EDuration="4.148720859s" podCreationTimestamp="2026-04-17 18:56:37 +0000 UTC" firstStartedPulling="2026-04-17 18:56:38.195142664 +0000 UTC m=+441.967887050" lastFinishedPulling="2026-04-17 18:56:40.617169632 +0000 UTC m=+444.389914018" observedRunningTime="2026-04-17 18:56:41.147640279 +0000 UTC m=+444.920384687" watchObservedRunningTime="2026-04-17 18:56:41.148720859 +0000 UTC m=+444.921465269" Apr 17 18:56:44.129623 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:44.129583 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:44.130035 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:44.129985 2574 scope.go:117] "RemoveContainer" containerID="0de9ca88fa082c07de34465316b2842b239fc5c9a31357fcd33e4318ca99006f" Apr 17 18:56:45.141748 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:45.141706 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" 
event={"ID":"2b98dbea-d54e-49c7-9b26-6e037e8fe470","Type":"ContainerStarted","Data":"3359b99623fabf0c4e50b2b34d014add57643ae7f552b7833e986ef5e45af672"} Apr 17 18:56:45.142186 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:45.141966 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:56:45.157447 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:45.157384 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" podStartSLOduration=1.9685403190000001 podStartE2EDuration="22.15736972s" podCreationTimestamp="2026-04-17 18:56:23 +0000 UTC" firstStartedPulling="2026-04-17 18:56:24.257303276 +0000 UTC m=+428.030047666" lastFinishedPulling="2026-04-17 18:56:44.446132678 +0000 UTC m=+448.218877067" observedRunningTime="2026-04-17 18:56:45.156393752 +0000 UTC m=+448.929138160" watchObservedRunningTime="2026-04-17 18:56:45.15736972 +0000 UTC m=+448.930114129" Apr 17 18:56:52.132314 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:52.132283 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g7zqd" Apr 17 18:56:56.147383 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:56:56.147352 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-x8dsg" Apr 17 18:57:05.108898 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:05.108864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-kbp4l" Apr 17 18:57:08.247720 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.247679 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj"] Apr 17 18:57:08.250040 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.250019 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.252192 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.252168 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-bvg94\"" Apr 17 18:57:08.252192 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.252189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 18:57:08.252357 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.252210 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 18:57:08.252357 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.252192 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 18:57:08.252357 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.252188 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 18:57:08.265156 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.265134 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj"] Apr 17 18:57:08.318287 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aec37621-49b9-45c4-b246-325387f61042-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aec37621-49b9-45c4-b246-325387f61042-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.318601 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.318454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngf4s\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-kube-api-access-ngf4s\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aec37621-49b9-45c4-b246-325387f61042-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aec37621-49b9-45c4-b246-325387f61042-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.419859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419747 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngf4s\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-kube-api-access-ngf4s\") pod 
\"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.420050 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.419871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.420687 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.420632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/aec37621-49b9-45c4-b246-325387f61042-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.422472 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.422445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.422570 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.422468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/aec37621-49b9-45c4-b246-325387f61042-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.422570 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.422462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.422665 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.422635 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/aec37621-49b9-45c4-b246-325387f61042-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.427195 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.427168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: \"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.427474 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.427454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngf4s\" (UniqueName: \"kubernetes.io/projected/aec37621-49b9-45c4-b246-325387f61042-kube-api-access-ngf4s\") pod \"istiod-openshift-gateway-55ff986f96-h9dbj\" (UID: 
\"aec37621-49b9-45c4-b246-325387f61042\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.560102 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.560065 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:08.692823 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:08.692786 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj"] Apr 17 18:57:08.695730 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:57:08.695701 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec37621_49b9_45c4_b246_325387f61042.slice/crio-98be11cf3fa18b2000ef6753ef84f4bc3a4815a72b9aeb760f5a9a02e7994470 WatchSource:0}: Error finding container 98be11cf3fa18b2000ef6753ef84f4bc3a4815a72b9aeb760f5a9a02e7994470: Status 404 returned error can't find the container with id 98be11cf3fa18b2000ef6753ef84f4bc3a4815a72b9aeb760f5a9a02e7994470 Apr 17 18:57:09.218253 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:09.218218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" event={"ID":"aec37621-49b9-45c4-b246-325387f61042","Type":"ContainerStarted","Data":"98be11cf3fa18b2000ef6753ef84f4bc3a4815a72b9aeb760f5a9a02e7994470"} Apr 17 18:57:11.788859 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:11.788818 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 17 18:57:11.789205 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:11.788891 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 17 18:57:12.230537 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:12.230501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" event={"ID":"aec37621-49b9-45c4-b246-325387f61042","Type":"ContainerStarted","Data":"b30ef69c37025ec07226cc50edd2bf3f4a6fc7f79a8d0039f4e16649a3698a20"} Apr 17 18:57:12.230782 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:12.230741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:12.232474 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:12.232452 2574 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-h9dbj container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 18:57:12.232532 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:12.232501 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" podUID="aec37621-49b9-45c4-b246-325387f61042" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:57:12.259094 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:12.259034 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" podStartSLOduration=1.168110246 podStartE2EDuration="4.25901635s" podCreationTimestamp="2026-04-17 18:57:08 +0000 UTC" 
firstStartedPulling="2026-04-17 18:57:08.697653666 +0000 UTC m=+472.470398057" lastFinishedPulling="2026-04-17 18:57:11.788559775 +0000 UTC m=+475.561304161" observedRunningTime="2026-04-17 18:57:12.257647383 +0000 UTC m=+476.030391791" watchObservedRunningTime="2026-04-17 18:57:12.25901635 +0000 UTC m=+476.031760759" Apr 17 18:57:13.234440 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:13.234416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-h9dbj" Apr 17 18:57:57.485517 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.485477 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd"] Apr 17 18:57:57.493041 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.493018 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:57:57.495307 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.495283 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 18:57:57.495451 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.495432 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 18:57:57.496218 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.496201 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-9vfjd\"" Apr 17 18:57:57.500434 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.500409 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd"] Apr 17 18:57:57.616950 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.616915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frqhs\" (UniqueName: \"kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs\") pod \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" (UID: \"75f4bbab-1975-4b0b-b522-2bb610a71905\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:57:57.717800 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.717723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frqhs\" (UniqueName: \"kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs\") pod \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" (UID: \"75f4bbab-1975-4b0b-b522-2bb610a71905\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:57:57.725657 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.725620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frqhs\" (UniqueName: \"kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs\") pod \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" (UID: \"75f4bbab-1975-4b0b-b522-2bb610a71905\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:57:57.804806 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.804691 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:57:57.928877 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:57.928845 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd"] Apr 17 18:57:57.932200 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:57:57.932175 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f4bbab_1975_4b0b_b522_2bb610a71905.slice/crio-d8362377ab67d89d4cbabaccd2c5c717d00e7cbe39bd5e301015d1656bb2f155 WatchSource:0}: Error finding container d8362377ab67d89d4cbabaccd2c5c717d00e7cbe39bd5e301015d1656bb2f155: Status 404 returned error can't find the container with id d8362377ab67d89d4cbabaccd2c5c717d00e7cbe39bd5e301015d1656bb2f155 Apr 17 18:57:58.384992 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:57:58.384956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" event={"ID":"75f4bbab-1975-4b0b-b522-2bb610a71905","Type":"ContainerStarted","Data":"d8362377ab67d89d4cbabaccd2c5c717d00e7cbe39bd5e301015d1656bb2f155"} Apr 17 18:58:00.394468 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:00.394420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" event={"ID":"75f4bbab-1975-4b0b-b522-2bb610a71905","Type":"ContainerStarted","Data":"cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc"} Apr 17 18:58:00.394993 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:00.394651 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:58:00.419911 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:00.419853 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" podStartSLOduration=1.374885569 podStartE2EDuration="3.419840622s" podCreationTimestamp="2026-04-17 18:57:57 +0000 UTC" firstStartedPulling="2026-04-17 18:57:57.933963396 +0000 UTC m=+521.706707789" lastFinishedPulling="2026-04-17 18:57:59.978918457 +0000 UTC m=+523.751662842" observedRunningTime="2026-04-17 18:58:00.418424392 +0000 UTC m=+524.191168800" watchObservedRunningTime="2026-04-17 18:58:00.419840622 +0000 UTC m=+524.192585030" Apr 17 18:58:11.400179 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:11.400150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:58:12.854663 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.854626 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd"] Apr 17 18:58:12.856065 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.856027 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" containerName="manager" containerID="cri-o://cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc" gracePeriod=2 Apr 17 18:58:12.860542 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.860520 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd"] Apr 17 18:58:12.876151 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.876127 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v"] Apr 17 18:58:12.876415 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.876404 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" containerName="manager" Apr 17 18:58:12.876457 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.876418 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" containerName="manager" Apr 17 18:58:12.876490 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.876467 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" containerName="manager" Apr 17 18:58:12.879032 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.879012 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:12.881542 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.881513 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:12.893454 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.893433 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v"] Apr 17 18:58:12.950276 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:12.950244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdz7\" (UniqueName: \"kubernetes.io/projected/6fc9c869-38cf-4930-bfe8-52531b4ea284-kube-api-access-5mdz7\") pod \"limitador-operator-controller-manager-85c4996f8c-kfc2v\" (UID: \"6fc9c869-38cf-4930-bfe8-52531b4ea284\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:13.051729 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.051699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdz7\" (UniqueName: \"kubernetes.io/projected/6fc9c869-38cf-4930-bfe8-52531b4ea284-kube-api-access-5mdz7\") pod \"limitador-operator-controller-manager-85c4996f8c-kfc2v\" (UID: \"6fc9c869-38cf-4930-bfe8-52531b4ea284\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:13.060355 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.060328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdz7\" (UniqueName: \"kubernetes.io/projected/6fc9c869-38cf-4930-bfe8-52531b4ea284-kube-api-access-5mdz7\") pod \"limitador-operator-controller-manager-85c4996f8c-kfc2v\" (UID: \"6fc9c869-38cf-4930-bfe8-52531b4ea284\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:13.078242 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.078222 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:58:13.080353 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.080330 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:13.152885 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.152810 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frqhs\" (UniqueName: \"kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs\") pod \"75f4bbab-1975-4b0b-b522-2bb610a71905\" (UID: \"75f4bbab-1975-4b0b-b522-2bb610a71905\") " Apr 17 18:58:13.155062 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.155021 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs" (OuterVolumeSpecName: "kube-api-access-frqhs") pod "75f4bbab-1975-4b0b-b522-2bb610a71905" (UID: "75f4bbab-1975-4b0b-b522-2bb610a71905"). InnerVolumeSpecName "kube-api-access-frqhs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:58:13.234992 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.234958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:13.254110 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.254075 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frqhs\" (UniqueName: \"kubernetes.io/projected/75f4bbab-1975-4b0b-b522-2bb610a71905-kube-api-access-frqhs\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 18:58:13.370525 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.370405 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v"] Apr 17 18:58:13.373176 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:58:13.373150 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc9c869_38cf_4930_bfe8_52531b4ea284.slice/crio-aa70cb8101b2e7c55302ae97dcd2a3cbe2f6dbfc238e1e51cab971cf9831a487 WatchSource:0}: Error finding container aa70cb8101b2e7c55302ae97dcd2a3cbe2f6dbfc238e1e51cab971cf9831a487: Status 404 returned error can't find the container with id aa70cb8101b2e7c55302ae97dcd2a3cbe2f6dbfc238e1e51cab971cf9831a487 Apr 17 18:58:13.437152 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.437111 2574 generic.go:358] "Generic (PLEG): container finished" podID="75f4bbab-1975-4b0b-b522-2bb610a71905" containerID="cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc" exitCode=0 Apr 17 18:58:13.437286 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.437161 2574 scope.go:117] "RemoveContainer" containerID="cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc" Apr 17 18:58:13.437286 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.437174 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" Apr 17 18:58:13.439034 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.439006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" event={"ID":"6fc9c869-38cf-4930-bfe8-52531b4ea284","Type":"ContainerStarted","Data":"aa70cb8101b2e7c55302ae97dcd2a3cbe2f6dbfc238e1e51cab971cf9831a487"} Apr 17 18:58:13.439657 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.439633 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:13.445412 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.445385 2574 scope.go:117] "RemoveContainer" containerID="cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc" Apr 17 18:58:13.445714 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:58:13.445685 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc\": container with ID starting with cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc not found: ID does not exist" containerID="cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc" Apr 17 18:58:13.445854 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.445727 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc"} err="failed to get container status \"cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc\": rpc error: code = NotFound desc = could not find container \"cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc\": container with ID starting with cb803a440979e1cefaed2ff47275b17177ba7e650c7655f93920fe5db8ebeedc not found: ID does not exist" Apr 17 18:58:13.448383 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:13.448356 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:14.114547 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.114514 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 18:58:14.119142 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.119114 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.121210 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.121180 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:14.121395 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.121380 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rrr25\"" Apr 17 18:58:14.131189 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.131169 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 18:58:14.262696 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.262657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.262915 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.262708 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p2jd\" (UniqueName: \"kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.363517 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.363478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p2jd\" (UniqueName: \"kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.363657 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.363557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.363903 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.363879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.372443 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.372384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p2jd\" (UniqueName: \"kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd\") pod \"kuadrant-operator-controller-manager-55c7f4c975-55tv4\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.429380 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.429343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:14.444577 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.444540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" event={"ID":"6fc9c869-38cf-4930-bfe8-52531b4ea284","Type":"ContainerStarted","Data":"346ea1645e6e785f38a059e8dbd8c4fddc7181c5839b896e240b64857899748e"} Apr 17 18:58:14.444701 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.444650 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:14.446797 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.446753 2574 status_manager.go:895] "Failed to get status for pod" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdlrd" err="pods \"limitador-operator-controller-manager-85c4996f8c-xdlrd\" is forbidden: User \"system:node:ip-10-0-141-118.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-118.ec2.internal' and this object" Apr 17 18:58:14.462322 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.462274 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" podStartSLOduration=2.462258525 podStartE2EDuration="2.462258525s" podCreationTimestamp="2026-04-17 18:58:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:58:14.46092746 +0000 UTC m=+538.233671868" watchObservedRunningTime="2026-04-17 18:58:14.462258525 +0000 UTC m=+538.235002934" Apr 17 18:58:14.577143 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:58:14.577111 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode65c38fe_8e49_4137_aeef_7495a355fb4f.slice/crio-0992a843f329bd952636061be5ec5e6b7808c77f7acb356eed38b4e8bea6f54c WatchSource:0}: Error finding container 0992a843f329bd952636061be5ec5e6b7808c77f7acb356eed38b4e8bea6f54c: Status 404 returned error can't find the container with id 0992a843f329bd952636061be5ec5e6b7808c77f7acb356eed38b4e8bea6f54c Apr 17 18:58:14.582207 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.582180 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 18:58:14.793050 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:14.792957 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75f4bbab-1975-4b0b-b522-2bb610a71905" 
path="/var/lib/kubelet/pods/75f4bbab-1975-4b0b-b522-2bb610a71905/volumes" Apr 17 18:58:15.450186 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:15.450153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" event={"ID":"e65c38fe-8e49-4137-aeef-7495a355fb4f","Type":"ContainerStarted","Data":"0992a843f329bd952636061be5ec5e6b7808c77f7acb356eed38b4e8bea6f54c"} Apr 17 18:58:19.464902 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:19.464861 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" event={"ID":"e65c38fe-8e49-4137-aeef-7495a355fb4f","Type":"ContainerStarted","Data":"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed"} Apr 17 18:58:19.465290 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:19.464932 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:58:19.495016 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:19.494961 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" podStartSLOduration=1.510147932 podStartE2EDuration="5.494945263s" podCreationTimestamp="2026-04-17 18:58:14 +0000 UTC" firstStartedPulling="2026-04-17 18:58:14.579792981 +0000 UTC m=+538.352537382" lastFinishedPulling="2026-04-17 18:58:18.56459032 +0000 UTC m=+542.337334713" observedRunningTime="2026-04-17 18:58:19.494118186 +0000 UTC m=+543.266862595" watchObservedRunningTime="2026-04-17 18:58:19.494945263 +0000 UTC m=+543.267689670" Apr 17 18:58:25.453130 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:25.453100 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kfc2v" Apr 17 18:58:30.470972 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:58:30.470936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 18:59:16.693182 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:16.693149 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:59:16.694136 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:16.694109 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 18:59:20.225061 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.225028 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:20.228428 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.228413 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:20.230860 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.230835 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 18:59:20.230990 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.230875 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wjcpc\"" Apr 17 18:59:20.235364 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.235339 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:20.329461 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.329430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spgm9\" (UniqueName: \"kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9\") pod \"maas-controller-5d6cb9fd64-fkv5k\" (UID: \"6df16b93-f167-4cbe-a9da-e5fcb125d176\") " pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:20.430865 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.430829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spgm9\" (UniqueName: \"kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9\") pod \"maas-controller-5d6cb9fd64-fkv5k\" (UID: \"6df16b93-f167-4cbe-a9da-e5fcb125d176\") " pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:20.438226 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.438204 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spgm9\" (UniqueName: \"kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9\") pod \"maas-controller-5d6cb9fd64-fkv5k\" (UID: \"6df16b93-f167-4cbe-a9da-e5fcb125d176\") " pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:20.541401 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.541317 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:20.667107 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:20.667082 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:20.669966 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:59:20.669939 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df16b93_f167_4cbe_a9da_e5fcb125d176.slice/crio-9b76c2c90746cbf31762134734cd068a22fa9e88273b41e0ae31da28fadde8e5 WatchSource:0}: Error finding container 9b76c2c90746cbf31762134734cd068a22fa9e88273b41e0ae31da28fadde8e5: Status 404 returned error can't find the container with id 9b76c2c90746cbf31762134734cd068a22fa9e88273b41e0ae31da28fadde8e5 Apr 17 18:59:21.663093 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:21.663056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" event={"ID":"6df16b93-f167-4cbe-a9da-e5fcb125d176","Type":"ContainerStarted","Data":"9b76c2c90746cbf31762134734cd068a22fa9e88273b41e0ae31da28fadde8e5"} Apr 17 18:59:23.671975 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:23.671937 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" event={"ID":"6df16b93-f167-4cbe-a9da-e5fcb125d176","Type":"ContainerStarted","Data":"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f"} Apr 17 18:59:23.672365 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:23.672041 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:23.717229 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:23.717180 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" podStartSLOduration=1.479406011 podStartE2EDuration="3.717162214s" podCreationTimestamp="2026-04-17 18:59:20 +0000 UTC" firstStartedPulling="2026-04-17 18:59:20.672146592 +0000 UTC m=+604.444890989" lastFinishedPulling="2026-04-17 18:59:22.909902791 +0000 UTC m=+606.682647192" observedRunningTime="2026-04-17 18:59:23.714662333 +0000 UTC m=+607.487406741" watchObservedRunningTime="2026-04-17 18:59:23.717162214 +0000 UTC m=+607.489906624" Apr 17 18:59:34.680875 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:34.680844 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:34.963098 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:34.963020 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 18:59:34.966458 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:34.966436 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:34.975567 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:34.975544 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 18:59:35.053302 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.053265 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndggk\" (UniqueName: \"kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk\") pod \"maas-controller-8589b4c7bc-tsl9p\" (UID: \"11d105e6-01bb-48e2-9bc8-0c491896a0ba\") " pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:35.154570 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.154488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndggk\" (UniqueName: \"kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk\") pod \"maas-controller-8589b4c7bc-tsl9p\" (UID: \"11d105e6-01bb-48e2-9bc8-0c491896a0ba\") " pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:35.162851 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.162827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndggk\" (UniqueName: \"kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk\") pod \"maas-controller-8589b4c7bc-tsl9p\" (UID: \"11d105e6-01bb-48e2-9bc8-0c491896a0ba\") " pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:35.277517 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.277441 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:35.401652 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.401590 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 18:59:35.404293 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:59:35.404259 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11d105e6_01bb_48e2_9bc8_0c491896a0ba.slice/crio-0ec61d118cc4bf20ea9ed132fcc34df6a69d6c6debb28c8259d8d552c9f3e352 WatchSource:0}: Error finding container 0ec61d118cc4bf20ea9ed132fcc34df6a69d6c6debb28c8259d8d552c9f3e352: Status 404 returned error can't find the container with id 0ec61d118cc4bf20ea9ed132fcc34df6a69d6c6debb28c8259d8d552c9f3e352 Apr 17 18:59:35.712003 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:35.711968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" event={"ID":"11d105e6-01bb-48e2-9bc8-0c491896a0ba","Type":"ContainerStarted","Data":"0ec61d118cc4bf20ea9ed132fcc34df6a69d6c6debb28c8259d8d552c9f3e352"} Apr 17 18:59:36.717141 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:36.717111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" event={"ID":"11d105e6-01bb-48e2-9bc8-0c491896a0ba","Type":"ContainerStarted","Data":"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e"} Apr 17 18:59:36.717569 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:36.717191 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:36.731209 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:36.731158 2574 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" podStartSLOduration=2.3877115780000002 podStartE2EDuration="2.731126235s" podCreationTimestamp="2026-04-17 18:59:34 +0000 UTC" firstStartedPulling="2026-04-17 18:59:35.405493152 +0000 UTC m=+619.178237539" lastFinishedPulling="2026-04-17 18:59:35.74890781 +0000 UTC m=+619.521652196" observedRunningTime="2026-04-17 18:59:36.730617054 +0000 UTC m=+620.503361462" watchObservedRunningTime="2026-04-17 18:59:36.731126235 +0000 UTC m=+620.503870644" Apr 17 18:59:47.725969 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:47.725942 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 18:59:47.768605 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:47.768573 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:47.768828 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:47.768807 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" podUID="6df16b93-f167-4cbe-a9da-e5fcb125d176" containerName="manager" containerID="cri-o://b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f" gracePeriod=10 Apr 17 18:59:48.016670 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.016645 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:48.069772 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.069745 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spgm9\" (UniqueName: \"kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9\") pod \"6df16b93-f167-4cbe-a9da-e5fcb125d176\" (UID: \"6df16b93-f167-4cbe-a9da-e5fcb125d176\") " Apr 17 18:59:48.072053 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.072027 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9" (OuterVolumeSpecName: "kube-api-access-spgm9") pod "6df16b93-f167-4cbe-a9da-e5fcb125d176" (UID: "6df16b93-f167-4cbe-a9da-e5fcb125d176"). InnerVolumeSpecName "kube-api-access-spgm9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:59:48.170739 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.170694 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spgm9\" (UniqueName: \"kubernetes.io/projected/6df16b93-f167-4cbe-a9da-e5fcb125d176-kube-api-access-spgm9\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 18:59:48.756891 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.756854 2574 generic.go:358] "Generic (PLEG): container finished" podID="6df16b93-f167-4cbe-a9da-e5fcb125d176" containerID="b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f" exitCode=0 Apr 17 18:59:48.757296 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.756900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" event={"ID":"6df16b93-f167-4cbe-a9da-e5fcb125d176","Type":"ContainerDied","Data":"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f"} Apr 17 18:59:48.757296 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.756915 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" Apr 17 18:59:48.757296 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.756925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d6cb9fd64-fkv5k" event={"ID":"6df16b93-f167-4cbe-a9da-e5fcb125d176","Type":"ContainerDied","Data":"9b76c2c90746cbf31762134734cd068a22fa9e88273b41e0ae31da28fadde8e5"} Apr 17 18:59:48.757296 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.756945 2574 scope.go:117] "RemoveContainer" containerID="b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f" Apr 17 18:59:48.765643 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.765619 2574 scope.go:117] "RemoveContainer" containerID="b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f" Apr 17 18:59:48.766006 ip-10-0-141-118 kubenswrapper[2574]: E0417 18:59:48.765980 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f\": container with ID starting with b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f not found: ID does not exist" containerID="b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f" Apr 17 18:59:48.766096 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.766018 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f"} err="failed to get container status \"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f\": rpc error: code = NotFound desc = could not find container \"b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f\": container with ID starting with b5b6d1ab59cfac7965b21821b5ba4310e259eb38fe50607743e1616e41ec166f not found: ID does not exist" Apr 17 18:59:48.776378 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.776352 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:48.780453 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.780429 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5d6cb9fd64-fkv5k"] Apr 17 18:59:48.792238 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:48.792216 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df16b93-f167-4cbe-a9da-e5fcb125d176" path="/var/lib/kubelet/pods/6df16b93-f167-4cbe-a9da-e5fcb125d176/volumes" Apr 17 18:59:58.815150 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.815121 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp"] Apr 17 18:59:58.817444 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.815428 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6df16b93-f167-4cbe-a9da-e5fcb125d176" containerName="manager" Apr 17 18:59:58.817444 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.815438 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df16b93-f167-4cbe-a9da-e5fcb125d176" containerName="manager" Apr 17 18:59:58.817444 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.815498 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6df16b93-f167-4cbe-a9da-e5fcb125d176" containerName="manager" Apr 17 18:59:58.818340 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.818325 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.821575 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.821555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 18:59:58.821681 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.821555 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 17 18:59:58.821681 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.821552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-xc4r9\"" Apr 17 18:59:58.821945 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.821930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 18:59:58.828604 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.828583 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp"] Apr 17 18:59:58.959994 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.959959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/776c8b0e-8511-4169-9bd2-ac817ec03a10-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.960161 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.960019 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.960161 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.960109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.960161 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.960142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.960267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.960166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:58.960267 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:58.960219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklfr\" 
(UniqueName: \"kubernetes.io/projected/776c8b0e-8511-4169-9bd2-ac817ec03a10-kube-api-access-nklfr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061172 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nklfr\" (UniqueName: \"kubernetes.io/projected/776c8b0e-8511-4169-9bd2-ac817ec03a10-kube-api-access-nklfr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/776c8b0e-8511-4169-9bd2-ac817ec03a10-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061342 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061688 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061752 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-home\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.061823 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.061791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.063598 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.063579 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/776c8b0e-8511-4169-9bd2-ac817ec03a10-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.063910 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.063890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/776c8b0e-8511-4169-9bd2-ac817ec03a10-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.068282 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.068232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklfr\" (UniqueName: \"kubernetes.io/projected/776c8b0e-8511-4169-9bd2-ac817ec03a10-kube-api-access-nklfr\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp\" (UID: \"776c8b0e-8511-4169-9bd2-ac817ec03a10\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.129162 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.129133 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 18:59:59.261439 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.261414 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp"] Apr 17 18:59:59.263606 ip-10-0-141-118 kubenswrapper[2574]: W0417 18:59:59.263577 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776c8b0e_8511_4169_9bd2_ac817ec03a10.slice/crio-6d9e40b5583657d2c6ae76a99ea50d7f6599bc7dc7d73113bdd36ce91f3777d5 WatchSource:0}: Error finding container 6d9e40b5583657d2c6ae76a99ea50d7f6599bc7dc7d73113bdd36ce91f3777d5: Status 404 returned error can't find the container with id 6d9e40b5583657d2c6ae76a99ea50d7f6599bc7dc7d73113bdd36ce91f3777d5 Apr 17 18:59:59.803278 ip-10-0-141-118 kubenswrapper[2574]: I0417 18:59:59.803241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" event={"ID":"776c8b0e-8511-4169-9bd2-ac817ec03a10","Type":"ContainerStarted","Data":"6d9e40b5583657d2c6ae76a99ea50d7f6599bc7dc7d73113bdd36ce91f3777d5"} Apr 17 19:00:00.134053 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.134020 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:00:00.137169 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.137152 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:00:00.139393 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.139368 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-8nwjw\"" Apr 17 19:00:00.142750 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.142728 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:00:00.270565 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.270524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwmxf\" (UniqueName: \"kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf\") pod \"maas-api-key-cleanup-29607540-9lqdg\" (UID: \"3272a0bd-5f05-4ef4-b772-69468bce1000\") " pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:00:00.377146 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.373884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwmxf\" (UniqueName: \"kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf\") pod \"maas-api-key-cleanup-29607540-9lqdg\" (UID: \"3272a0bd-5f05-4ef4-b772-69468bce1000\") " pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:00:00.385144 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.385044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwmxf\" (UniqueName: \"kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf\") pod \"maas-api-key-cleanup-29607540-9lqdg\" (UID: \"3272a0bd-5f05-4ef4-b772-69468bce1000\") " pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:00:00.448719 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.448683 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:00:00.582550 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.582517 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:00:00.585342 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:00:00.585303 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3272a0bd_5f05_4ef4_b772_69468bce1000.slice/crio-487da4e7cde9a88fc3f03d8e7a491626f626ad0de5282747fde4d9e74fcfc979 WatchSource:0}: Error finding container 487da4e7cde9a88fc3f03d8e7a491626f626ad0de5282747fde4d9e74fcfc979: Status 404 returned error can't find the container with id 487da4e7cde9a88fc3f03d8e7a491626f626ad0de5282747fde4d9e74fcfc979 Apr 17 19:00:00.809038 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:00.808996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerStarted","Data":"487da4e7cde9a88fc3f03d8e7a491626f626ad0de5282747fde4d9e74fcfc979"} Apr 17 19:00:02.817344 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:02.817309 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerStarted","Data":"8c0403eeabe345bb6b766a0d511168359735d6b30433e5bd4071ae06cc724517"} Apr 17 19:00:02.831174 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:02.830940 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" podStartSLOduration=1.773527118 podStartE2EDuration="2.830924371s" podCreationTimestamp="2026-04-17 19:00:00 +0000 UTC" firstStartedPulling="2026-04-17 19:00:00.588210663 +0000 UTC m=+644.360955049" lastFinishedPulling="2026-04-17 19:00:01.645607904 +0000 UTC m=+645.418352302" observedRunningTime="2026-04-17 19:00:02.829992759 +0000 UTC m=+646.602737166" watchObservedRunningTime="2026-04-17 19:00:02.830924371 +0000 UTC m=+646.603668801" Apr 17 19:00:05.829580 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:05.829541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" event={"ID":"776c8b0e-8511-4169-9bd2-ac817ec03a10","Type":"ContainerStarted","Data":"48f63dad94c662e97c47c9c5a93a008b05992f46d8fc1e20f250ddc7d4be5240"} Apr 17 19:00:11.851367 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:11.851333 2574 generic.go:358] "Generic (PLEG): container finished" podID="776c8b0e-8511-4169-9bd2-ac817ec03a10" containerID="48f63dad94c662e97c47c9c5a93a008b05992f46d8fc1e20f250ddc7d4be5240" exitCode=0 Apr 17 19:00:11.851761 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:11.851409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" event={"ID":"776c8b0e-8511-4169-9bd2-ac817ec03a10","Type":"ContainerDied","Data":"48f63dad94c662e97c47c9c5a93a008b05992f46d8fc1e20f250ddc7d4be5240"} Apr 17 19:00:22.890535 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:22.890503 2574 generic.go:358] "Generic (PLEG): container finished" podID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerID="8c0403eeabe345bb6b766a0d511168359735d6b30433e5bd4071ae06cc724517" exitCode=6 Apr 17 19:00:22.890938 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:22.890584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerDied","Data":"8c0403eeabe345bb6b766a0d511168359735d6b30433e5bd4071ae06cc724517"} Apr 17 19:00:22.890938 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:22.890935 2574 scope.go:117] "RemoveContainer" containerID="8c0403eeabe345bb6b766a0d511168359735d6b30433e5bd4071ae06cc724517" Apr 17 19:00:23.895650 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:23.895612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerStarted","Data":"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254"} Apr 17 19:00:26.907264 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:26.907174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" event={"ID":"776c8b0e-8511-4169-9bd2-ac817ec03a10","Type":"ContainerStarted","Data":"c77c33093c037e95877bcc5ed5abde88f2837b9e35cda9cada0f90571ade173a"} Apr 17 19:00:26.907657 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:26.907397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 19:00:26.924250 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:26.924197 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" podStartSLOduration=1.5482325700000001 podStartE2EDuration="28.924180626s" podCreationTimestamp="2026-04-17 18:59:58 +0000 UTC" firstStartedPulling="2026-04-17 18:59:59.265556463 +0000 UTC m=+643.038300852" lastFinishedPulling="2026-04-17 19:00:26.641504519 +0000 UTC m=+670.414248908" observedRunningTime="2026-04-17 19:00:26.92344618 +0000 UTC m=+670.696190612" watchObservedRunningTime="2026-04-17 19:00:26.924180626 +0000 UTC m=+670.696925036" Apr 17 19:00:36.015506 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.015467 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv"] Apr 17 19:00:36.019060 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.019039 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.021196 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.021173 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 17 19:00:36.028223 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.028201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv"] Apr 17 19:00:36.099674 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.099882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.099882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgx4\" (UniqueName: \"kubernetes.io/projected/91214520-1a9b-4aa3-ba13-46ace22c41c3-kube-api-access-wkgx4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.099882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91214520-1a9b-4aa3-ba13-46ace22c41c3-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.099882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.099882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.099863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201023 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.200983 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wkgx4\" (UniqueName: \"kubernetes.io/projected/91214520-1a9b-4aa3-ba13-46ace22c41c3-kube-api-access-wkgx4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201201 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91214520-1a9b-4aa3-ba13-46ace22c41c3-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201201 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201201 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201385 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201256 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201385 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201785 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.201785 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 
19:00:36.201785 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.201726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.203863 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.203832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/91214520-1a9b-4aa3-ba13-46ace22c41c3-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.203991 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.203945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/91214520-1a9b-4aa3-ba13-46ace22c41c3-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.208991 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.208947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgx4\" (UniqueName: \"kubernetes.io/projected/91214520-1a9b-4aa3-ba13-46ace22c41c3-kube-api-access-wkgx4\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv\" (UID: \"91214520-1a9b-4aa3-ba13-46ace22c41c3\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.329084 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.329051 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:36.462205 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.462182 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv"] Apr 17 19:00:36.464314 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:00:36.464287 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91214520_1a9b_4aa3_ba13_46ace22c41c3.slice/crio-afd1d294f403ff103b93dd5db554dc9a0786c3b85d6c20ca10a2f5c3e419dc2b WatchSource:0}: Error finding container afd1d294f403ff103b93dd5db554dc9a0786c3b85d6c20ca10a2f5c3e419dc2b: Status 404 returned error can't find the container with id afd1d294f403ff103b93dd5db554dc9a0786c3b85d6c20ca10a2f5c3e419dc2b Apr 17 19:00:36.466202 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.466185 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:00:36.941420 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.941375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" event={"ID":"91214520-1a9b-4aa3-ba13-46ace22c41c3","Type":"ContainerStarted","Data":"2e3542419adf870505bdcbadcaa97b883be935b284cc2f88fdf612f241355ff0"} Apr 17 19:00:36.941420 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:36.941422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" event={"ID":"91214520-1a9b-4aa3-ba13-46ace22c41c3","Type":"ContainerStarted","Data":"afd1d294f403ff103b93dd5db554dc9a0786c3b85d6c20ca10a2f5c3e419dc2b"} Apr 17 19:00:37.924103 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:37.924071 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp" Apr 17 19:00:43.985554 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:43.985470 2574 generic.go:358] "Generic (PLEG): container finished" podID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" exitCode=6 Apr 17 19:00:43.985554 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:43.985540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerDied","Data":"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254"} Apr 17 19:00:43.986056 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:43.985584 2574 scope.go:117] "RemoveContainer" containerID="8c0403eeabe345bb6b766a0d511168359735d6b30433e5bd4071ae06cc724517" Apr 17 19:00:43.986056 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:43.985981 2574 scope.go:117] "RemoveContainer" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" Apr 17 19:00:43.986199 ip-10-0-141-118 kubenswrapper[2574]: E0417 19:00:43.986179 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607540-9lqdg_opendatahub(3272a0bd-5f05-4ef4-b772-69468bce1000)\"" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" Apr 17 19:00:44.308672 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.308635 2574 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5"] Apr 17 19:00:44.313404 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.313386 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.315531 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.315510 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 19:00:44.319596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.319572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5"] Apr 17 19:00:44.479181 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.479402 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b077e520-fdc2-4e56-8010-1344dfb6f4e9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.479402 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.479402 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.479402 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.479402 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.479346 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglg7\" (UniqueName: \"kubernetes.io/projected/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kube-api-access-pglg7\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580333 ip-10-0-141-118 kubenswrapper[2574]: 
I0417 19:00:44.580242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580333 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b077e520-fdc2-4e56-8010-1344dfb6f4e9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580333 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580333 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580659 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580659 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580378 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pglg7\" (UniqueName: \"kubernetes.io/projected/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kube-api-access-pglg7\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.580759 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.580700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.581059 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.581036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.581211 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.581070 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.584229 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.584199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b077e520-fdc2-4e56-8010-1344dfb6f4e9-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.584490 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.584470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b077e520-fdc2-4e56-8010-1344dfb6f4e9-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.588123 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.588091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglg7\" (UniqueName: \"kubernetes.io/projected/b077e520-fdc2-4e56-8010-1344dfb6f4e9-kube-api-access-pglg7\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5\" (UID: \"b077e520-fdc2-4e56-8010-1344dfb6f4e9\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.624604 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.624372 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:44.765367 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.765339 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5"] Apr 17 19:00:44.767605 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:00:44.767581 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb077e520_fdc2_4e56_8010_1344dfb6f4e9.slice/crio-f569ee077677b5fa5eac9c19f9554ea239fbaf9b423106d249f252fe2cbf246d WatchSource:0}: Error finding container f569ee077677b5fa5eac9c19f9554ea239fbaf9b423106d249f252fe2cbf246d: Status 404 returned error can't find the container with id f569ee077677b5fa5eac9c19f9554ea239fbaf9b423106d249f252fe2cbf246d Apr 17 19:00:44.990670 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.990627 2574 generic.go:358] "Generic (PLEG): container finished" podID="91214520-1a9b-4aa3-ba13-46ace22c41c3" containerID="2e3542419adf870505bdcbadcaa97b883be935b284cc2f88fdf612f241355ff0" exitCode=0 Apr 17 19:00:44.991134 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.990701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" event={"ID":"91214520-1a9b-4aa3-ba13-46ace22c41c3","Type":"ContainerDied","Data":"2e3542419adf870505bdcbadcaa97b883be935b284cc2f88fdf612f241355ff0"} Apr 17 19:00:44.993813 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.993784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" event={"ID":"b077e520-fdc2-4e56-8010-1344dfb6f4e9","Type":"ContainerStarted","Data":"80615f7fd6537a487c0c3f5a18c7f0008a0ffc4dc5fe2ad093aecae3ba66032a"} Apr 17 19:00:44.993910 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:44.993819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" event={"ID":"b077e520-fdc2-4e56-8010-1344dfb6f4e9","Type":"ContainerStarted","Data":"f569ee077677b5fa5eac9c19f9554ea239fbaf9b423106d249f252fe2cbf246d"} Apr 17 19:00:45.998354 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:45.998322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" event={"ID":"91214520-1a9b-4aa3-ba13-46ace22c41c3","Type":"ContainerStarted","Data":"0f11d4545781b5b6fb269400b5ff2a1596cf1ee10fcebcb1556d705caf2f8bcc"} Apr 17 19:00:45.998827 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:45.998636 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:46.015567 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:46.015510 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" podStartSLOduration=10.717986597 podStartE2EDuration="11.015491592s" podCreationTimestamp="2026-04-17 19:00:35 +0000 UTC" firstStartedPulling="2026-04-17 19:00:44.991467291 +0000 UTC m=+688.764211684" lastFinishedPulling="2026-04-17 19:00:45.288972279 +0000 UTC m=+689.061716679" observedRunningTime="2026-04-17 19:00:46.014864739 +0000 UTC m=+689.787609142" watchObservedRunningTime="2026-04-17 19:00:46.015491592 +0000 UTC m=+689.788236003" Apr 17 19:00:51.017155 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:51.017120 2574 generic.go:358] "Generic (PLEG): 
container finished" podID="b077e520-fdc2-4e56-8010-1344dfb6f4e9" containerID="80615f7fd6537a487c0c3f5a18c7f0008a0ffc4dc5fe2ad093aecae3ba66032a" exitCode=0 Apr 17 19:00:51.017524 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:51.017194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" event={"ID":"b077e520-fdc2-4e56-8010-1344dfb6f4e9","Type":"ContainerDied","Data":"80615f7fd6537a487c0c3f5a18c7f0008a0ffc4dc5fe2ad093aecae3ba66032a"} Apr 17 19:00:52.021844 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:52.021803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" event={"ID":"b077e520-fdc2-4e56-8010-1344dfb6f4e9","Type":"ContainerStarted","Data":"54d1ad102e001d432af17325d29d55aaa34c94788b75d65ba1cd096604c07ba0"} Apr 17 19:00:52.022276 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:52.022039 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:00:52.040429 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:52.040373 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" podStartSLOduration=7.838575397 podStartE2EDuration="8.040357685s" podCreationTimestamp="2026-04-17 19:00:44 +0000 UTC" firstStartedPulling="2026-04-17 19:00:51.01781935 +0000 UTC m=+694.790563739" lastFinishedPulling="2026-04-17 19:00:51.219601638 +0000 UTC m=+694.992346027" observedRunningTime="2026-04-17 19:00:52.038714275 +0000 UTC m=+695.811458683" watchObservedRunningTime="2026-04-17 19:00:52.040357685 +0000 UTC m=+695.813102143" Apr 17 19:00:56.790107 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:56.790077 2574 scope.go:117] "RemoveContainer" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" Apr 17 19:00:57.015448 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:57.015423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv" Apr 17 19:00:58.044674 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:58.044636 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerStarted","Data":"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2"} Apr 17 19:00:59.068901 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:59.068870 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:00:59.069352 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:00:59.069093 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" containerID="cri-o://1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2" gracePeriod=30 Apr 17 19:01:03.038070 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:03.038041 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5" Apr 17 19:01:17.825839 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:17.825815 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:01:17.982979 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:17.982896 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwmxf\" (UniqueName: \"kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf\") pod \"3272a0bd-5f05-4ef4-b772-69468bce1000\" (UID: \"3272a0bd-5f05-4ef4-b772-69468bce1000\") " Apr 17 19:01:17.985125 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:17.985097 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf" (OuterVolumeSpecName: "kube-api-access-qwmxf") pod "3272a0bd-5f05-4ef4-b772-69468bce1000" (UID: "3272a0bd-5f05-4ef4-b772-69468bce1000"). InnerVolumeSpecName "kube-api-access-qwmxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 19:01:18.084119 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.084082 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwmxf\" (UniqueName: \"kubernetes.io/projected/3272a0bd-5f05-4ef4-b772-69468bce1000-kube-api-access-qwmxf\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 19:01:18.114648 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.114616 2574 generic.go:358] "Generic (PLEG): container finished" podID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerID="1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2" exitCode=6 Apr 17 19:01:18.114825 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.114682 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" Apr 17 19:01:18.114825 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.114700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerDied","Data":"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2"} Apr 17 19:01:18.114825 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.114751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607540-9lqdg" event={"ID":"3272a0bd-5f05-4ef4-b772-69468bce1000","Type":"ContainerDied","Data":"487da4e7cde9a88fc3f03d8e7a491626f626ad0de5282747fde4d9e74fcfc979"} Apr 17 19:01:18.114825 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.114799 2574 scope.go:117] "RemoveContainer" containerID="1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2" Apr 17 19:01:18.123332 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.123317 2574 scope.go:117] "RemoveContainer" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" Apr 17 19:01:18.130324 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.130308 2574 scope.go:117] "RemoveContainer" containerID="1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2" Apr 17 19:01:18.130559 ip-10-0-141-118 kubenswrapper[2574]: E0417 19:01:18.130542 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2\": container with ID starting with 1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2 not found: ID does not exist" containerID="1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2" Apr 17 19:01:18.130618 
ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.130565 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2"} err="failed to get container status \"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2\": rpc error: code = NotFound desc = could not find container \"1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2\": container with ID starting with 1005ea5f1591b45b04cd219254b096ca2c6747554674af60dc87fc6ec7d39ac2 not found: ID does not exist" Apr 17 19:01:18.130618 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.130580 2574 scope.go:117] "RemoveContainer" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" Apr 17 19:01:18.130839 ip-10-0-141-118 kubenswrapper[2574]: E0417 19:01:18.130821 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254\": container with ID starting with 256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254 not found: ID does not exist" containerID="256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254" Apr 17 19:01:18.130894 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.130847 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254"} err="failed to get container status \"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254\": rpc error: code = NotFound desc = could not find container \"256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254\": container with ID starting with 256eb0c51a6deead60764c16421c39b17e05b328130a984e57cfb119285ea254 not found: ID does not exist" Apr 17 19:01:18.133435 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.133411 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:01:18.136841 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.136817 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607540-9lqdg"] Apr 17 19:01:18.792473 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:01:18.792439 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" path="/var/lib/kubelet/pods/3272a0bd-5f05-4ef4-b772-69468bce1000/volumes" Apr 17 19:02:38.334711 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.334622 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 19:02:38.335226 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.334963 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" podUID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" containerName="manager" containerID="cri-o://93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e" gracePeriod=10 Apr 17 19:02:38.575811 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.575787 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 19:02:38.687941 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.687857 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndggk\" (UniqueName: \"kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk\") pod \"11d105e6-01bb-48e2-9bc8-0c491896a0ba\" (UID: \"11d105e6-01bb-48e2-9bc8-0c491896a0ba\") " Apr 17 19:02:38.690027 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.689999 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk" (OuterVolumeSpecName: "kube-api-access-ndggk") pod "11d105e6-01bb-48e2-9bc8-0c491896a0ba" (UID: "11d105e6-01bb-48e2-9bc8-0c491896a0ba"). InnerVolumeSpecName "kube-api-access-ndggk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 19:02:38.788688 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:38.788656 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndggk\" (UniqueName: \"kubernetes.io/projected/11d105e6-01bb-48e2-9bc8-0c491896a0ba-kube-api-access-ndggk\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 19:02:39.393175 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.393135 2574 generic.go:358] "Generic (PLEG): container finished" podID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" containerID="93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e" exitCode=0 Apr 17 19:02:39.393650 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.393204 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" Apr 17 19:02:39.393650 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.393229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" event={"ID":"11d105e6-01bb-48e2-9bc8-0c491896a0ba","Type":"ContainerDied","Data":"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e"} Apr 17 19:02:39.393650 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.393283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-tsl9p" event={"ID":"11d105e6-01bb-48e2-9bc8-0c491896a0ba","Type":"ContainerDied","Data":"0ec61d118cc4bf20ea9ed132fcc34df6a69d6c6debb28c8259d8d552c9f3e352"} Apr 17 19:02:39.393650 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.393304 2574 scope.go:117] "RemoveContainer" containerID="93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e" Apr 17 19:02:39.401439 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.401421 2574 scope.go:117] "RemoveContainer" containerID="93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e" Apr 17 19:02:39.401677 ip-10-0-141-118 kubenswrapper[2574]: E0417 19:02:39.401655 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e\": container with ID starting with 93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e not found: ID does not exist" containerID="93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e" Apr 17 19:02:39.401724 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.401685 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e"} err="failed to get container status \"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e\": rpc error: code = NotFound desc = could not find container \"93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e\": container with ID starting with 93301f0e9bfc5524aa899682048ae3070a85f1cb574ab04c57ee143341d1e59e not found: ID does not exist" Apr 17 19:02:39.407588 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.407566 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 19:02:39.410307 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:39.410288 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-tsl9p"] Apr 17 19:02:40.140113 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140076 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-825m7"] Apr 17 19:02:40.140413 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140401 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140455 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140415 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140455 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140426 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" containerName="manager" Apr 17 19:02:40.140455 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140431 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" containerName="manager" Apr 17 19:02:40.140455 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140444 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140455 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140449 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140599 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140509 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140599 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140518 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:02:40.140599 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.140525 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" containerName="manager" Apr 17 19:02:40.144651 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.144636 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:40.146844 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.146826 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wjcpc\"" Apr 17 19:02:40.151414 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.151388 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-825m7"] Apr 17 19:02:40.201122 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.201091 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8gx\" (UniqueName: \"kubernetes.io/projected/c3b1da31-f882-4663-9ceb-f31bb3679dd9-kube-api-access-vk8gx\") pod \"maas-controller-8589b4c7bc-825m7\" (UID: \"c3b1da31-f882-4663-9ceb-f31bb3679dd9\") " pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:40.301653 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.301613 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8gx\" (UniqueName: \"kubernetes.io/projected/c3b1da31-f882-4663-9ceb-f31bb3679dd9-kube-api-access-vk8gx\") pod \"maas-controller-8589b4c7bc-825m7\" (UID: \"c3b1da31-f882-4663-9ceb-f31bb3679dd9\") " pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:40.309990 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.309955 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8gx\" (UniqueName: \"kubernetes.io/projected/c3b1da31-f882-4663-9ceb-f31bb3679dd9-kube-api-access-vk8gx\") pod \"maas-controller-8589b4c7bc-825m7\" (UID: \"c3b1da31-f882-4663-9ceb-f31bb3679dd9\") " pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:40.456083 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.455995 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:40.575683 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.575645 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8589b4c7bc-825m7"] Apr 17 19:02:40.577987 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:02:40.577956 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b1da31_f882_4663_9ceb_f31bb3679dd9.slice/crio-c16dadf5efc8e5d17220d8f596bc7870bb2ef12bbc1146838ce076423741d5ed WatchSource:0}: Error finding container c16dadf5efc8e5d17220d8f596bc7870bb2ef12bbc1146838ce076423741d5ed: Status 404 returned error can't find the container with id c16dadf5efc8e5d17220d8f596bc7870bb2ef12bbc1146838ce076423741d5ed Apr 17 19:02:40.793217 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:40.793138 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11d105e6-01bb-48e2-9bc8-0c491896a0ba" path="/var/lib/kubelet/pods/11d105e6-01bb-48e2-9bc8-0c491896a0ba/volumes" Apr 17 19:02:41.402482 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:41.402436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-825m7" event={"ID":"c3b1da31-f882-4663-9ceb-f31bb3679dd9","Type":"ContainerStarted","Data":"845262a253788c53227589bb67a8caa72c0c363490dfe87bd918f613d6795771"} Apr 17 19:02:41.402482 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:41.402473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8589b4c7bc-825m7" event={"ID":"c3b1da31-f882-4663-9ceb-f31bb3679dd9","Type":"ContainerStarted","Data":"c16dadf5efc8e5d17220d8f596bc7870bb2ef12bbc1146838ce076423741d5ed"} Apr 17 19:02:41.402694 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:41.402570 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:02:41.417183 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:41.417136 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8589b4c7bc-825m7" podStartSLOduration=1.056888292 podStartE2EDuration="1.417123198s" podCreationTimestamp="2026-04-17 19:02:40 +0000 UTC" firstStartedPulling="2026-04-17 19:02:40.5792676 +0000 UTC m=+804.352011987" lastFinishedPulling="2026-04-17 19:02:40.939502498 +0000 UTC m=+804.712246893" observedRunningTime="2026-04-17 19:02:41.415822769 +0000 UTC m=+805.188567178" watchObservedRunningTime="2026-04-17 19:02:41.417123198 +0000 UTC m=+805.189867606" Apr 17 19:02:52.412399 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:02:52.412366 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8589b4c7bc-825m7" Apr 17 19:04:16.715281 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:04:16.715256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:04:16.717288 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:04:16.717264 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:09:16.740715 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:09:16.740690 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:09:16.742997 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:09:16.742976 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:12:52.896587 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:52.896551 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 19:12:52.898791 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:52.896811 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" podUID="e65c38fe-8e49-4137-aeef-7495a355fb4f" containerName="manager" containerID="cri-o://a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed" gracePeriod=10 Apr 17 19:12:53.148479 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.148425 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 19:12:53.237469 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.237442 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume\") pod \"e65c38fe-8e49-4137-aeef-7495a355fb4f\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " Apr 17 19:12:53.237647 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.237524 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p2jd\" (UniqueName: \"kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd\") pod \"e65c38fe-8e49-4137-aeef-7495a355fb4f\" (UID: \"e65c38fe-8e49-4137-aeef-7495a355fb4f\") " Apr 17 19:12:53.237892 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.237865 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e65c38fe-8e49-4137-aeef-7495a355fb4f" (UID: "e65c38fe-8e49-4137-aeef-7495a355fb4f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 19:12:53.239694 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.239671 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd" (OuterVolumeSpecName: "kube-api-access-8p2jd") pod "e65c38fe-8e49-4137-aeef-7495a355fb4f" (UID: "e65c38fe-8e49-4137-aeef-7495a355fb4f"). InnerVolumeSpecName "kube-api-access-8p2jd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 19:12:53.338461 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.338426 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8p2jd\" (UniqueName: \"kubernetes.io/projected/e65c38fe-8e49-4137-aeef-7495a355fb4f-kube-api-access-8p2jd\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 19:12:53.338461 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.338456 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e65c38fe-8e49-4137-aeef-7495a355fb4f-extensions-socket-volume\") on node \"ip-10-0-141-118.ec2.internal\" DevicePath \"\"" Apr 17 19:12:53.505394 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.505301 2574 generic.go:358] "Generic (PLEG): container finished" podID="e65c38fe-8e49-4137-aeef-7495a355fb4f" containerID="a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed" exitCode=0 Apr 17 19:12:53.505394 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.505366 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" Apr 17 19:12:53.505610 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.505391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" event={"ID":"e65c38fe-8e49-4137-aeef-7495a355fb4f","Type":"ContainerDied","Data":"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed"} Apr 17 19:12:53.505610 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.505427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4" event={"ID":"e65c38fe-8e49-4137-aeef-7495a355fb4f","Type":"ContainerDied","Data":"0992a843f329bd952636061be5ec5e6b7808c77f7acb356eed38b4e8bea6f54c"} Apr 17 19:12:53.505610 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.505442 2574 scope.go:117] "RemoveContainer" containerID="a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed" Apr 17 19:12:53.514261 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.514246 2574 scope.go:117] "RemoveContainer" containerID="a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed" Apr 17 19:12:53.514487 ip-10-0-141-118 kubenswrapper[2574]: E0417 19:12:53.514467 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed\": container with ID starting with a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed not found: ID does not exist" containerID="a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed" Apr 17 19:12:53.514550 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.514499 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed"} err="failed to get container status \"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed\": rpc error: code = NotFound desc = could not find container \"a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed\": container with ID starting with a2b9ceb4cf7556b443d2c9bad34109740554b873ea8ce4a3ad40de029a5a3aed not found: ID does not exist" Apr 17 19:12:53.524666 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.524643 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 19:12:53.527600 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:53.527577 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-55tv4"] Apr 17 19:12:54.792721 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:12:54.792687 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e65c38fe-8e49-4137-aeef-7495a355fb4f" path="/var/lib/kubelet/pods/e65c38fe-8e49-4137-aeef-7495a355fb4f/volumes" Apr 17 19:13:58.966756 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.966712 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg"] Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967053 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967065 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967086 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e65c38fe-8e49-4137-aeef-7495a355fb4f" containerName="manager" Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967092 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e65c38fe-8e49-4137-aeef-7495a355fb4f" containerName="manager" Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967145 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e65c38fe-8e49-4137-aeef-7495a355fb4f" containerName="manager" Apr 17 19:13:58.967260 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.967153 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3272a0bd-5f05-4ef4-b772-69468bce1000" containerName="cleanup" Apr 17 19:13:58.969883 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.969867 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:58.972302 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.972279 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-rrr25\"" Apr 17 19:13:58.980559 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:58.980536 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg"] Apr 17 19:13:59.099831 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.099788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af156c44-e625-4cb2-9a70-ab442ea17b74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.100002 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.099883 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnw4\" (UniqueName: \"kubernetes.io/projected/af156c44-e625-4cb2-9a70-ab442ea17b74-kube-api-access-prnw4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.201138 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.201098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prnw4\" (UniqueName: \"kubernetes.io/projected/af156c44-e625-4cb2-9a70-ab442ea17b74-kube-api-access-prnw4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.201311 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.201189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af156c44-e625-4cb2-9a70-ab442ea17b74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.201592 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.201574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/af156c44-e625-4cb2-9a70-ab442ea17b74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.208870 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.208841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnw4\" (UniqueName: \"kubernetes.io/projected/af156c44-e625-4cb2-9a70-ab442ea17b74-kube-api-access-prnw4\") pod \"kuadrant-operator-controller-manager-55c7f4c975-txsbg\" (UID: \"af156c44-e625-4cb2-9a70-ab442ea17b74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.280583 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.280504 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.406305 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.406277 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg"] Apr 17 19:13:59.409334 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:13:59.409309 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf156c44_e625_4cb2_9a70_ab442ea17b74.slice/crio-a684173e35c5385ced8824be1d84014c7d6e5d8647734212cd0784b59bb3c72b WatchSource:0}: Error finding container a684173e35c5385ced8824be1d84014c7d6e5d8647734212cd0784b59bb3c72b: Status 404 returned error can't find the container with id a684173e35c5385ced8824be1d84014c7d6e5d8647734212cd0784b59bb3c72b Apr 17 19:13:59.411670 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.411652 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 19:13:59.740605 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.740565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" event={"ID":"af156c44-e625-4cb2-9a70-ab442ea17b74","Type":"ContainerStarted","Data":"86029dfbfe20544c0d8edcaa82efe6b7522b5cbe0284f0b92a4e0305e46fbd6f"} Apr 17 19:13:59.740605 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.740605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" event={"ID":"af156c44-e625-4cb2-9a70-ab442ea17b74","Type":"ContainerStarted","Data":"a684173e35c5385ced8824be1d84014c7d6e5d8647734212cd0784b59bb3c72b"} Apr 17 19:13:59.740850 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.740683 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:13:59.763637 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:13:59.763589 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" podStartSLOduration=1.763575167 podStartE2EDuration="1.763575167s" podCreationTimestamp="2026-04-17 19:13:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 19:13:59.76119119 +0000 UTC m=+1483.533935598" watchObservedRunningTime="2026-04-17 19:13:59.763575167 +0000 UTC m=+1483.536319574" Apr 17 19:14:10.745718 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:14:10.745686 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-txsbg" Apr 17 19:14:16.763072 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:14:16.763043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:14:16.765360 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:14:16.765340 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:19:16.788081 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:19:16.788050 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:19:16.794623 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:19:16.794599 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log" Apr 17 19:23:40.364275 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:40.364197 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-kbp4l_ff0265b3-8593-4d8b-9da4-d3f26de60afc/manager/0.log" Apr 17 19:23:40.613079 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:40.613040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-8589b4c7bc-825m7_c3b1da31-f882-4663-9ceb-f31bb3679dd9/manager/0.log" Apr 17 19:23:40.724687 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:40.724600 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-x8dsg_2b98dbea-d54e-49c7-9b26-6e037e8fe470/manager/2.log" Apr 17 19:23:40.948881 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:40.948842 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-jdmcg_21834787-d3c0-4a75-b176-8640876eb579/manager/0.log" Apr 17 19:23:42.986964 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:42.986939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-txsbg_af156c44-e625-4cb2-9a70-ab442ea17b74/manager/0.log" Apr 17 19:23:44.257546 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:44.257513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kfc2v_6fc9c869-38cf-4930-bfe8-52531b4ea284/manager/0.log" Apr 17 19:23:44.697725 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:44.697691 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-h9dbj_aec37621-49b9-45c4-b246-325387f61042/discovery/0.log" Apr 17 19:23:44.804006 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:44.803974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-59447f86f4-6dz8x_a4c43ec7-2e95-4a42-97b3-88721e3ac0ab/kube-auth-proxy/0.log" Apr 17 19:23:45.586882 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:45.586856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp_776c8b0e-8511-4169-9bd2-ac817ec03a10/storage-initializer/0.log" Apr 17 19:23:45.593976 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:45.593954 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-27zlp_776c8b0e-8511-4169-9bd2-ac817ec03a10/main/0.log" Apr 17 19:23:45.931867 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:45.931761 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5_b077e520-fdc2-4e56-8010-1344dfb6f4e9/storage-initializer/0.log" Apr 17 19:23:45.940656 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:45.940634 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-gf2l5_b077e520-fdc2-4e56-8010-1344dfb6f4e9/main/0.log" Apr 17 19:23:46.049476 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:46.049444 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv_91214520-1a9b-4aa3-ba13-46ace22c41c3/storage-initializer/0.log" Apr 17 19:23:46.055909 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:46.055888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-jn6sv_91214520-1a9b-4aa3-ba13-46ace22c41c3/main/0.log" Apr 17 19:23:57.569131 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:57.569092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bq86n_4cde13d3-2f3b-4ae9-b90e-00369cefc3cf/global-pull-secret-syncer/0.log" Apr 17 19:23:57.677237 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:57.677211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-j7fsm_7681ddb2-af4c-468a-93b5-9e7d47992b0f/konnectivity-agent/0.log" Apr 17 19:23:57.768962 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:23:57.768934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-118.ec2.internal_de49da82ba182640048bc9ceae0a365f/haproxy/0.log" Apr 17 19:24:02.819407 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:02.819349 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-txsbg_af156c44-e625-4cb2-9a70-ab442ea17b74/manager/0.log" Apr 17 19:24:02.909945 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:02.909902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kfc2v_6fc9c869-38cf-4930-bfe8-52531b4ea284/manager/0.log" Apr 17 19:24:04.594931 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:04.594895 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7dc5f5d68-qvmvl_92661bb5-d795-4fae-b174-a3151e1c7420/metrics-server/0.log" Apr 17 19:24:04.784565 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:04.784536 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlk7p_744be0a5-8d64-4004-ab9b-a120080a13b5/node-exporter/0.log" Apr 17 19:24:04.803977 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:04.803944 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlk7p_744be0a5-8d64-4004-ab9b-a120080a13b5/kube-rbac-proxy/0.log" Apr 17 19:24:04.823000 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:04.822977 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zlk7p_744be0a5-8d64-4004-ab9b-a120080a13b5/init-textfile/0.log" Apr 17 19:24:05.082344 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:05.082300 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hld44_2261af19-5997-4854-9384-97c64c2d7dc4/prometheus-operator/0.log" Apr 17 19:24:05.101992 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:05.101968 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hld44_2261af19-5997-4854-9384-97c64c2d7dc4/kube-rbac-proxy/0.log" Apr 17 19:24:06.156249 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.156212 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"] Apr 17 19:24:06.159799 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.159753 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.161929 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.161906 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"openshift-service-ca.crt\""
Apr 17 19:24:06.162757 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.162742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-558qr\"/\"default-dockercfg-6t7f8\""
Apr 17 19:24:06.162828 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.162795 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-558qr\"/\"kube-root-ca.crt\""
Apr 17 19:24:06.165832 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.165796 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"]
Apr 17 19:24:06.183034 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.183008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-proc\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.183186 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.183041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-lib-modules\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.183186 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.183061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-podres\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.183273 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.183174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9lk6\" (UniqueName: \"kubernetes.io/projected/54b1dccb-6a41-43e8-9761-4dcef524bb8b-kube-api-access-d9lk6\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.183273 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.183215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-sys\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284359 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-lib-modules\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284359 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-podres\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9lk6\" (UniqueName: \"kubernetes.io/projected/54b1dccb-6a41-43e8-9761-4dcef524bb8b-kube-api-access-d9lk6\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-sys\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-proc\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-podres\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284535 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-lib-modules\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-sys\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.284596 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.284588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54b1dccb-6a41-43e8-9761-4dcef524bb8b-proc\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.292470 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.292445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9lk6\" (UniqueName: \"kubernetes.io/projected/54b1dccb-6a41-43e8-9761-4dcef524bb8b-kube-api-access-d9lk6\") pod \"perf-node-gather-daemonset-blhxs\" (UID: \"54b1dccb-6a41-43e8-9761-4dcef524bb8b\") " pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.470796 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.470688 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.597132 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.597090 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"]
Apr 17 19:24:06.599461 ip-10-0-141-118 kubenswrapper[2574]: W0417 19:24:06.599431 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54b1dccb_6a41_43e8_9761_4dcef524bb8b.slice/crio-4053d79eeb74f870e2cd5f79937ede996f8728a9a7a1b6fe402c11aa5e8a2bf1 WatchSource:0}: Error finding container 4053d79eeb74f870e2cd5f79937ede996f8728a9a7a1b6fe402c11aa5e8a2bf1: Status 404 returned error can't find the container with id 4053d79eeb74f870e2cd5f79937ede996f8728a9a7a1b6fe402c11aa5e8a2bf1
Apr 17 19:24:06.601183 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.601168 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 19:24:06.825719 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.825684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs" event={"ID":"54b1dccb-6a41-43e8-9761-4dcef524bb8b","Type":"ContainerStarted","Data":"c874cd9ec455523f46a7a2557ac3e9c66ff08845ca9eee47b04368544c96067e"}
Apr 17 19:24:06.825719 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.825716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs" event={"ID":"54b1dccb-6a41-43e8-9761-4dcef524bb8b","Type":"ContainerStarted","Data":"4053d79eeb74f870e2cd5f79937ede996f8728a9a7a1b6fe402c11aa5e8a2bf1"}
Apr 17 19:24:06.825995 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.825818 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:06.843049 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:06.843001 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs" podStartSLOduration=0.842987172 podStartE2EDuration="842.987172ms" podCreationTimestamp="2026-04-17 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 19:24:06.841139149 +0000 UTC m=+2090.613883557" watchObservedRunningTime="2026-04-17 19:24:06.842987172 +0000 UTC m=+2090.615731625"
Apr 17 19:24:07.279791 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:07.279695 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-qwbn9_585034e5-3c8d-42a7-9bd7-7f42418e3163/download-server/0.log"
Apr 17 19:24:08.555698 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:08.555669 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r7tlb_3853ba23-4818-4e5c-adb0-a74c55faa515/dns/0.log"
Apr 17 19:24:08.574512 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:08.574486 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-r7tlb_3853ba23-4818-4e5c-adb0-a74c55faa515/kube-rbac-proxy/0.log"
Apr 17 19:24:08.597391 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:08.597364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fzlng_19201734-1263-46e9-b401-4768c56c505c/dns-node-resolver/0.log"
Apr 17 19:24:09.050717 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:09.050685 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-f445ccfd9-bm6zt_638e9ce4-6adf-4245-a9d2-47e2df3045e6/registry/0.log"
Apr 17 19:24:09.068703 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:09.068674 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7gs7_075172fd-6f0b-45b8-8765-5b6397bdb2b8/node-ca/0.log"
Apr 17 19:24:09.935899 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:09.935868 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-h9dbj_aec37621-49b9-45c4-b246-325387f61042/discovery/0.log"
Apr 17 19:24:09.958865 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:09.958836 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-59447f86f4-6dz8x_a4c43ec7-2e95-4a42-97b3-88721e3ac0ab/kube-auth-proxy/0.log"
Apr 17 19:24:10.591311 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:10.591280 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-q6kpz_02bba305-18ac-410d-ab1e-0abfaf32082a/serve-healthcheck-canary/0.log"
Apr 17 19:24:11.082373 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:11.082350 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-66qdh_b8fdc99b-b5cd-41d2-b3ec-c73459772491/kube-rbac-proxy/0.log"
Apr 17 19:24:11.102849 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:11.102823 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-66qdh_b8fdc99b-b5cd-41d2-b3ec-c73459772491/exporter/0.log"
Apr 17 19:24:11.126341 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:11.126314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-66qdh_b8fdc99b-b5cd-41d2-b3ec-c73459772491/extractor/0.log"
Apr 17 19:24:12.839350 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:12.839324 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-558qr/perf-node-gather-daemonset-blhxs"
Apr 17 19:24:12.947953 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:12.947908 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-kbp4l_ff0265b3-8593-4d8b-9da4-d3f26de60afc/manager/0.log"
Apr 17 19:24:13.041325 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:13.041277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-8589b4c7bc-825m7_c3b1da31-f882-4663-9ceb-f31bb3679dd9/manager/0.log"
Apr 17 19:24:13.059340 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:13.059308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-x8dsg_2b98dbea-d54e-49c7-9b26-6e037e8fe470/manager/1.log"
Apr 17 19:24:13.070331 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:13.070302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-x8dsg_2b98dbea-d54e-49c7-9b26-6e037e8fe470/manager/2.log"
Apr 17 19:24:13.133856 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:13.133756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6fc6488c9d-jdmcg_21834787-d3c0-4a75-b176-8640876eb579/manager/0.log"
Apr 17 19:24:16.814866 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:16.814835 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log"
Apr 17 19:24:16.818385 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:16.818365 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log"
Apr 17 19:24:18.965974 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:18.965943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-n5kw8_73d6f00a-e262-4e05-a576-fa1aa63bd8a8/kube-storage-version-migrator-operator/1.log"
Apr 17 19:24:18.966934 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:18.966912 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-n5kw8_73d6f00a-e262-4e05-a576-fa1aa63bd8a8/kube-storage-version-migrator-operator/0.log"
Apr 17 19:24:20.100146 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.100118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/kube-multus-additional-cni-plugins/0.log"
Apr 17 19:24:20.124302 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.124273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/egress-router-binary-copy/0.log"
Apr 17 19:24:20.143668 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.143637 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/cni-plugins/0.log"
Apr 17 19:24:20.163423 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.163394 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/bond-cni-plugin/0.log"
Apr 17 19:24:20.184099 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.184074 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/routeoverride-cni/0.log"
Apr 17 19:24:20.209337 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.209305 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/whereabouts-cni-bincopy/0.log"
Apr 17 19:24:20.229002 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.228974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4xm2n_b985cdb5-8694-44dd-aa5f-2770ef10e3c4/whereabouts-cni/0.log"
Apr 17 19:24:20.459317 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.459248 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xc74x_6e2d8622-7004-4d8f-9297-ccace6582a00/kube-multus/0.log"
Apr 17 19:24:20.520436 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.520408 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dpqmj_9e29f722-7b28-401a-9488-46ff42062854/network-metrics-daemon/0.log"
Apr 17 19:24:20.538503 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:20.538476 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dpqmj_9e29f722-7b28-401a-9488-46ff42062854/kube-rbac-proxy/0.log"
Apr 17 19:24:21.612512 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.612480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-controller/0.log"
Apr 17 19:24:21.630162 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.630137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/0.log"
Apr 17 19:24:21.640269 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.640242 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovn-acl-logging/1.log"
Apr 17 19:24:21.657287 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.657256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/kube-rbac-proxy-node/0.log"
Apr 17 19:24:21.678534 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.678504 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 19:24:21.699816 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.699792 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/northd/0.log"
Apr 17 19:24:21.719908 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.719852 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/nbdb/0.log"
Apr 17 19:24:21.740386 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.740356 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/sbdb/0.log"
Apr 17 19:24:21.836865 ip-10-0-141-118 kubenswrapper[2574]: I0417 19:24:21.836833 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rj69g_91638f07-e924-407f-bb78-79ea02748faa/ovnkube-controller/0.log"