Apr 16 23:25:39.265284 ip-10-0-136-147 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:25:39.773800 ip-10-0-136-147 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:25:39.773800 ip-10-0-136-147 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:25:39.773800 ip-10-0-136-147 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:25:39.773800 ip-10-0-136-147 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:25:39.773800 ip-10-0-136-147 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:25:39.775587 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.775494 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
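All of the deprecation warnings above point at the same fix: move these options out of the kubelet's argument list and into the file named by --config (here /etc/kubernetes/kubelet.conf, as the flag dump further down shows). A minimal sketch of the config-file equivalents, assuming the upstream KubeletConfiguration v1beta1 schema; the values are copied from this node's flag dump later in the log, not from its actual rendered config:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration has no direct field; the warning says to
# use eviction thresholds instead (the value below is illustrative only)
evictionHard:
  memory.available: 100Mi

The exception is --pod-infra-container-image, which has no config-file replacement: per the warning, the sandbox (pause) image is meant to come from the CRI runtime's own configuration instead (for CRI-O, pause_image in crio.conf).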
Apr 16 23:25:39.777976 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777945 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:39.777976 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777975 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:39.777976 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777981 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777985 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777990 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777994 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.777998 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778002 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778006 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778010 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778014 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778019 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778023 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778026 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778030 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778035 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778039 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778043 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778047 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778051 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778055 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778059 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:39.778176 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778063 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778067 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778070 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778074 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778079 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778083 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778087 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778090 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778095 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778101 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778108 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778112 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778116 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778120 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778125 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778130 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778134 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778138 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778142 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:39.778875 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778146 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778153 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778157 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778162 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778166 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778170 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778174 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778179 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778183 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778187 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778191 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778195 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778199 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778203 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778208 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778213 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778217 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778221 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778225 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:39.779401 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778230 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778234 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778238 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778242 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778246 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778251 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778255 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778259 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778265 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778269 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778273 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778277 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778288 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778293 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778297 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778301 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778305 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778309 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778313 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778318 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:39.779891 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778323 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:39.780414 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778332 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:39.780414 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778339 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:39.780414 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778344 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:39.780414 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778348 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:39.780414 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.778352 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:39.781089 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781075 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:39.781089 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781089 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781094 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781098 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781103 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781106 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781111 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781115 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781119 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781123 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781128 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781132 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781136 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781140 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781144 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781148 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781152 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781156 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781160 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781164 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:39.781206 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781170 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781178 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781184 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781191 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781196 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781200 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781205 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781210 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781215 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781219 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781223 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781229 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781233 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781238 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781242 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781246 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781250 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781254 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781258 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:39.782025 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781262 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781267 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781270 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781274 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781278 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781282 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781287 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781291 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781294 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781298 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781302 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781306 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781311 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781315 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781319 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781326 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781331 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781335 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781339 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781344 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:39.782869 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781348 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781353 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781357 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781361 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781365 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781369 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781394 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781401 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781406 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781411 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781416 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781420 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781424 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781428 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781432 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781436 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781440 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781445 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781449 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781453 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:39.783459 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781457 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781461 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781465 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781469 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781473 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781477 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.781481 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
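Each "unrecognized feature gate" warning above names an OpenShift-level gate (Example, Example2, the NewOLM* and VSphere* families, and so on) that the upstream kubelet binary has no registration for; it warns and continues, while the gates it does recognize are applied (note KMSv1, flagged deprecated at feature_gate.go:349, and ServiceAccountTokenNodeBinding, already GA per feature_gate.go:351). These presumably reach the kubelet through the featureGates stanza of the rendered config file; a sketch of what such a stanza looks like, with gate names taken from the log and values illustrative:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  # recognized by the kubelet, but deprecated (feature_gate.go:349)
  KMSv1: true
  # recognized, already GA, so setting it only draws a warning (feature_gate.go:351)
  ServiceAccountTokenNodeBinding: true
  # OpenShift-only gates like this one produce "unrecognized feature gate"
  OVNObservability: true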
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781578 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781589 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781608 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781621 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781627 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781633 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781639 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781647 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781653 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781658 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781664 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781669 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781674 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781679 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781684 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781689 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:25:39.784019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781694 2569 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781698 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781703 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781709 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781714 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781719 2569 flags.go:64] FLAG: --config-dir=""
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781724 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781729 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781736 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781741 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781746 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781751 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781756 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781761 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781766 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781771 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781776 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781782 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781787 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781792 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781797 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781802 2569 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781806 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781814 2569 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781820 2569 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:25:39.784877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781825 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781831 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781836 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781842 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781846 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781852 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781857 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781862 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781867 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781871 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781876 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781881 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781886 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781890 2569 flags.go:64] FLAG: --feature-gates=""
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781897 2569 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781902 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781907 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781913 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781918 2569 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781923 2569 flags.go:64] FLAG: --help="false"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781927 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-136-147.ec2.internal"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781933 2569 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781938 2569 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 23:25:39.785665 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781943 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781949 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781969 2569 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781975 2569 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781979 2569 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781984 2569 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781989 2569 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781993 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.781999 2569 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782005 2569 flags.go:64] FLAG: --kube-reserved=""
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782010 2569 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782015 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782020 2569 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782024 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782029 2569 flags.go:64] FLAG: --lock-file=""
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782034 2569 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782039 2569 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782044 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782057 2569 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782063 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782067 2569 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782072 2569 flags.go:64] FLAG: --logging-format="text"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782077 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782083 2569 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 23:25:39.786267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782088 2569 flags.go:64] FLAG: --manifest-url=""
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782092 2569 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782100 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782105 2569 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782111 2569 flags.go:64] FLAG: --max-pods="110"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782116 2569 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782121 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782126 2569 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782134 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782140 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782144 2569 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782149 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782161 2569 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782166 2569 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782171 2569 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782176 2569 flags.go:64] FLAG: --pod-cidr=""
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782180 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782190 2569 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782196 2569 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782201 2569 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782206 2569 flags.go:64] FLAG: --port="10250"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782211 2569 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782215 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03def40b0261c2767"
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782221 2569 flags.go:64] FLAG: --qos-reserved=""
Apr 16 23:25:39.786855 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782227 2569 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782232 2569 flags.go:64] FLAG: --register-node="true"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782237 2569 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782242 2569 flags.go:64] FLAG: --register-with-taints=""
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782248 2569 flags.go:64] FLAG: --registry-burst="10"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782253 2569 flags.go:64] FLAG: --registry-qps="5"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782258 2569 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782263 2569 flags.go:64] FLAG: --reserved-memory=""
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782269 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782275 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782288 2569 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782294 2569 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782299 2569 flags.go:64] FLAG: --runonce="false"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782304 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782309 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782313 2569 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782318 2569 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782324 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782330 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782335 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782340 2569 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782345 2569 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782349 2569 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782354 2569 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782359 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782364 2569 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 23:25:39.787451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782369 2569 flags.go:64] FLAG: --system-cgroups=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782375 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782384 2569 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782388 2569 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782393 2569 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782400 2569 flags.go:64] FLAG: --tls-min-version=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782405 2569 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782410 2569 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782415 2569 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782420 2569 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782425 2569 flags.go:64] FLAG: --v="2"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782432 2569 flags.go:64] FLAG: --version="false"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782438 2569 flags.go:64] FLAG: --vmodule=""
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782445 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 23:25:39.788099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.782450 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
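The flags.go:64 dump records every flag's value as parsed from the command line, defaults included, before the file named by --config is applied; per the upstream kubelet docs, flags set explicitly on the command line keep precedence over the config file. Many of the dumped knobs have direct KubeletConfiguration fields, so the same migration pattern as above applies; a sketch, with upstream field names and values copied from the dump:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
maxPods: 110                            # --max-pods
containerLogMaxFiles: 5                 # --container-log-max-files
containerLogMaxSize: 10Mi               # --container-log-max-size
serializeImagePulls: true               # --serialize-image-pulls
imageGCHighThresholdPercent: 85         # --image-gc-high-threshold
imageGCLowThresholdPercent: 80          # --image-gc-low-threshold
streamingConnectionIdleTimeout: 4h0m0s  # --streaming-connection-idle-timeout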
23:25:39.782699 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782704 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782708 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782713 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782718 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782723 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782727 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782731 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782735 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782739 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782743 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782747 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:39.788730 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782751 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782756 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782760 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782764 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782768 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782772 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782777 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782781 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782785 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782789 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782793 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782797 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782801 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782807 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782811 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782815 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782820 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782824 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782828 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782832 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:39.789271 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782836 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782840 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782844 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782849 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782853 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782858 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782862 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782866 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782870 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782875 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782879 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782883 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782887 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782891 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782895 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782900 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782904 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782909 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782913 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782917 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:39.789785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782921 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782925 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782929 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782933 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782937 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782945 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782951 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782975 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782981 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782985 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782989 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782993 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.782997 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783002 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783006 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783010 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783015 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783019 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:39.790295 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.783025 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:39.790749 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.783824 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
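The block of feature_gate.go:328 warnings above appears to be benign: the kubelet is handed the cluster's full feature-gate list, and any gate this kubelet binary does not compile in is logged as a warning and skipped rather than treated as a fatal configuration error. Gates it does know are applied, with extra notices for the deprecated KMSv1 gate and the already-GA ServiceAccountTokenNodeBinding gate, and the effective set is then printed by feature_gate.go:384. A minimal sketch of that warn-and-skip pattern, assuming k8s.io/component-base/featuregate and not taken from the kubelet's actual code:

    // Illustrative only: apply a desired gate map while merely warning on
    // names this binary does not know (hypothetical example).
    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    func main() {
        fg := featuregate.NewFeatureGate()
        // Gates this binary compiles in (tiny subset for illustration).
        _ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            "ImageVolume": {Default: false, PreRelease: featuregate.Beta},
            "KMSv1":       {Default: false, PreRelease: featuregate.Deprecated},
        })
        // Desired state handed down from cluster configuration.
        desired := map[string]bool{
            "ImageVolume":      true,
            "KMSv1":            true,
            "OVNObservability": true, // unknown to this binary
        }
        known := fg.GetAll()
        for name, enabled := range desired {
            if _, ok := known[featuregate.Feature(name)]; !ok {
                fmt.Printf("W unrecognized feature gate: %s\n", name)
                continue // warn and skip instead of failing hard
            }
            _ = fg.SetFromMap(map[string]bool{name: enabled})
        }
        fmt.Printf("feature gates: %v\n", fg) // String() prints the effective set
    }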
Apr 16 23:25:39.792638 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.792618 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 23:25:39.792670 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.792639 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:25:39.792703 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792699 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792705 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792708 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792712 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792715 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792718 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792721 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792724 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792728 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792732 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792736 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:39.792735 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792739 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792742 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792744 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792747 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792750 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792753 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792755 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792758 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792760 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792763 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792765 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792768 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792771 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792773 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792776 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792779 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792781 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792783 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792786 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792789 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:39.793028 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792791 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792794 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792798 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792802 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792805 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792808 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792811 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792814 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792816 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792819 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792822 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792824 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792827 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792829 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792833 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792836 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792838 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792841 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792844 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:39.793514 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792847 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792850 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792852 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792855 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792858 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792860 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792863 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792866 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792868 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792871 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792874 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792876 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792879 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792881 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792884 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792886 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792889 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792892 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792894 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792896 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:39.794011 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792899 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792902 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792904 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792907 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792909 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792912 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792914 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792918 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792921 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792923 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792926 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792929 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792931 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792934 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792937 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.792939 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:39.794501 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.792944 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
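The same effective feature-gate map is printed after each parsing pass (feature_gate.go:384), so when auditing a boot like this one it can be easier to diff the printed maps mechanically than to read the warning blocks. A small, hypothetical log-analysis helper (not part of the kubelet) that parses one of these map lines:

    // Hypothetical helper: parse a "feature gates: {map[...]}" log line
    // into a Go map for diffing between passes or between nodes.
    package main

    import (
        "fmt"
        "regexp"
        "strings"
    )

    func parseGates(line string) map[string]bool {
        gates := map[string]bool{}
        m := regexp.MustCompile(`map\[(.*)\]`).FindStringSubmatch(line)
        if m == nil {
            return gates
        }
        for _, kv := range strings.Fields(m[1]) {
            parts := strings.SplitN(kv, ":", 2)
            if len(parts) == 2 {
                gates[parts[0]] = parts[1] == "true"
            }
        }
        return gates
    }

    func main() {
        line := `feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}`
        fmt.Println(parseGates(line)["NodeSwap"]) // false
    }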
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793055 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793060 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793063 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793066 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793069 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793072 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793074 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793077 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793080 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793082 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793085 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793087 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793090 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793093 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793095 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793098 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793101 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793103 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793106 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:25:39.794889 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793108 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793111 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793114 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793117 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793120 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793123 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793126 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793128 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793131 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793134 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793136 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793139 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793142 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793144 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793147 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793149 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793152 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793154 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793157 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793159 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:25:39.795397 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793162 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793164 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793167 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793170 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793172 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793174 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793177 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793179 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793182 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793184 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793188 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793190 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793193 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793196 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793199 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793202 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793204 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793207 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793209 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793212 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:25:39.795868 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793215 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793218 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793220 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793223 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793225 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793228 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793231 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793233 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793236 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793238 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793242 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793246 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793249 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793252 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793255 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793259 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793262 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793265 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793268 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:25:39.796358 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793271 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793274 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793277 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793279 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793283 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793285 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793288 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:39.793291 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.793296 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.794215 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 23:25:39.796816 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.796534 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 23:25:39.797631 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.797619 2569 server.go:1019] "Starting client certificate rotation"
Apr 16 23:25:39.797736 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.797719 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:25:39.797767 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.797760 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
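At this point the kubelet holds only bootstrap credentials and is requesting its real client certificate, the CSR flow that completes further down with csr-xvl5r. To see what the rotation manager is currently using, one can parse the client certificate on disk; /var/lib/kubelet/pki/kubelet-client-current.pem is the kubelet's conventional location, so treat the path as an assumption:

    // Sketch: inspect the kubelet client certificate that rotation maintains.
    // Standard library only; the path is the kubelet's usual default.
    package main

    import (
        "crypto/x509"
        "encoding/pem"
        "fmt"
        "log"
        "os"
    )

    func main() {
        data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
        if err != nil {
            log.Fatal(err)
        }
        block, _ := pem.Decode(data) // first PEM block is the certificate
        if block == nil {
            log.Fatal("no PEM block found")
        }
        cert, err := x509.ParseCertificate(block.Bytes)
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("subject:  ", cert.Subject) // e.g. O=system:nodes, CN=system:node:<name>
        fmt.Println("not after:", cert.NotAfter)
    }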
Apr 16 23:25:39.826018 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.826000 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:25:39.829712 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.829695 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:25:39.847405 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.847384 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 16 23:25:39.854010 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.853990 2569 log.go:25] "Validated CRI v1 image API"
Apr 16 23:25:39.856197 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.856183 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
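The cgroupDriver="systemd" line shows the kubelet taking its cgroup driver from the runtime over CRI instead of from static configuration, which removes a classic kubelet/CRI-O mismatch failure mode. A sketch of the same RuntimeConfig query, assuming k8s.io/cri-api and CRI-O's conventional socket path:

    // Sketch: ask the container runtime for its cgroup driver over CRI,
    // the same RuntimeConfig RPC the kubelet used above. The socket path
    // and error handling are simplified assumptions.
    package main

    import (
        "context"
        "fmt"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        client := runtimeapi.NewRuntimeServiceClient(conn)
        resp, err := client.RuntimeConfig(context.Background(), &runtimeapi.RuntimeConfigRequest{})
        if err != nil {
            log.Fatal(err)
        }
        // CgroupDriver is an enum: SYSTEMD or CGROUPFS.
        fmt.Println("cgroup driver:", resp.GetLinux().GetCgroupDriver())
    }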
Apr 16 23:25:39.859032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.859005 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b3800ac4-028a-46a6-b9c1-8671757eef81:/dev/nvme0n1p3 fc0245de-4f4c-4432-855f-d2319456cc27:/dev/nvme0n1p4]
Apr 16 23:25:39.859099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.859029 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 23:25:39.859099 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.859089 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:25:39.864920 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.864818 2569 manager.go:217] Machine: {Timestamp:2026-04-16 23:25:39.862626007 +0000 UTC m=+0.464420203 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098465 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec298c0ea93a62fb524e1f27a25173b4 SystemUUID:ec298c0e-a93a-62fb-524e-1f27a25173b4 BootID:9902103c-f9ff-425d-b40f-a30b325bb064 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c5:4c:62:b9:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c5:4c:62:b9:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:78:8b:9b:05:0e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 23:25:39.864920 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.864915 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 23:25:39.865037 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.865006 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 23:25:39.867092 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867067 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
nodeConfig={"NodeName":"ip-10-0-136-147.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 23:25:39.867311 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867291 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 23:25:39.867311 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867299 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 23:25:39.867365 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867312 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:25:39.868247 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.868237 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:25:39.870327 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.870315 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:25:39.870430 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.870421 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 23:25:39.873067 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873057 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873076 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873089 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873098 2569 kubelet.go:397] "Adding apiserver pod source" Apr 16 23:25:39.873190 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873107 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 23:25:39.874225 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.874214 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 
Apr 16 23:25:39.867311 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867291 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 23:25:39.867311 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867299 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 23:25:39.867365 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.867312 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 23:25:39.868247 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.868237 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 23:25:39.870327 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.870315 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:25:39.870430 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.870421 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 23:25:39.873067 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873057 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873076 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873089 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 23:25:39.873103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873098 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 16 23:25:39.873190 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.873107 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 23:25:39.874225 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.874214 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 23:25:39.874263 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.874232 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 23:25:39.877468 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.877452 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 23:25:39.879166 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.879154 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 23:25:39.880863 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880833 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 23:25:39.880863 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880861 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880872 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880882 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880891 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880899 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880908 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880917 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880928 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880938 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880977 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 23:25:39.881073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.880991 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 23:25:39.882064 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.882053 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 23:25:39.882064 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.882065 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 23:25:39.886030 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886016 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 23:25:39.886136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886051 2569 server.go:1295] "Started kubelet"
Apr 16 23:25:39.886193 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886153 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 23:25:39.886244 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886181 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 23:25:39.886311 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886249 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 23:25:39.886712 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.886684 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 23:25:39.886712 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.886705 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-147.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 23:25:39.886834 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.886758 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 23:25:39.886878 ip-10-0-136-147 systemd[1]: Started Kubernetes Kubelet.
Apr 16 23:25:39.887762 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.887672 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 23:25:39.889260 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.889243 2569 server.go:317] "Adding debug handlers to kubelet server"
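The system:anonymous errors just above are normal for this phase of boot: TLS bootstrap has not finished, so the kubelet's API calls are still anonymous and are rejected until the CSR below is approved and issued. Separately, the podresources API is now being served on the local socket named in the log; a sketch client for it, assuming the k8s.io/kubelet podresources v1 API:

    // Sketch: list pod resources from the socket the kubelet just exposed
    // (endpoint taken from the log line above).
    package main

    import (
        "context"
        "fmt"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        podresourcesv1 "k8s.io/kubelet/pkg/apis/podresources/v1"
    )

    func main() {
        conn, err := grpc.Dial("unix:///var/lib/kubelet/pod-resources/kubelet.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        client := podresourcesv1.NewPodResourcesListerClient(conn)
        resp, err := client.List(context.Background(), &podresourcesv1.ListPodResourcesRequest{})
        if err != nil {
            log.Fatal(err)
        }
        for _, pod := range resp.GetPodResources() {
            fmt.Printf("%s/%s\n", pod.GetNamespace(), pod.GetName())
        }
    }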
Apr 16 23:25:39.896737 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.896638 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 23:25:39.896737 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.896645 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 23:25:39.897418 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.897394 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 23:25:39.899256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.898826 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found"
Apr 16 23:25:39.899256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.898901 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 23:25:39.899256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.898920 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 23:25:39.899256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.899118 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 23:25:39.899256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.899129 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 23:25:39.899628 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.896665 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-147.ec2.internal.18a6f9f670537571 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-147.ec2.internal,UID:ip-10-0-136-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-147.ec2.internal,},FirstTimestamp:2026-04-16 23:25:39.886028145 +0000 UTC m=+0.487822342,LastTimestamp:2026-04-16 23:25:39.886028145 +0000 UTC m=+0.487822342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-147.ec2.internal,}"
Apr 16 23:25:39.899628 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.899583 2569 factory.go:55] Registering systemd factory
Apr 16 23:25:39.899628 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.899605 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 16 23:25:39.899988 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.899838 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 23:25:39.899988 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.899971 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-147.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 23:25:39.900308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900292 2569 factory.go:153] Registering CRI-O factory
Apr 16 23:25:39.900394 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900310 2569 factory.go:223] Registration of the crio container factory successfully
Apr 16 23:25:39.900394 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900360 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 23:25:39.900394 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900386 2569 factory.go:103] Registering Raw factory
Apr 16 23:25:39.900539 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900420 2569 manager.go:1196] Started watching for new ooms in manager
Apr 16 23:25:39.900797 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.900766 2569 manager.go:319] Starting recovery of all containers
Apr 16 23:25:39.901350 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.901330 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xvl5r"
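csr.go reports the bootstrap CSR csr-xvl5r approved here, and it is reported issued a few lines further down. The same state can be read back through the API; a sketch assuming client-go and a kubeconfig at /etc/kubernetes/kubeconfig (an assumption; any admin kubeconfig works):

    // Sketch: read back the bootstrap CSR named in the log and report
    // whether it was approved and whether a certificate was issued.
    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/kubeconfig")
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        csr, err := cs.CertificatesV1().CertificateSigningRequests().
            Get(context.Background(), "csr-xvl5r", metav1.GetOptions{})
        if err != nil {
            log.Fatal(err)
        }
        for _, cond := range csr.Status.Conditions {
            fmt.Printf("condition: %s\n", cond.Type) // e.g. Approved
        }
        // A non-empty status.certificate means the CSR was issued.
        fmt.Println("issued:", len(csr.Status.Certificate) > 0)
    }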
Apr 16 23:25:39.901692 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.901675 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 23:25:39.909477 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.909321 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xvl5r"
Apr 16 23:25:39.910630 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.910618 2569 manager.go:324] Recovery completed
Apr 16 23:25:39.914743 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.914731 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 23:25:39.917127 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917112 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 23:25:39.917180 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917139 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 23:25:39.917180 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917149 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID"
Apr 16 23:25:39.917620 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917609 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 23:25:39.917670 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917620 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 23:25:39.917670 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.917637 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 23:25:39.918837 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.918777 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-147.ec2.internal.18a6f9f6722dfc78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-147.ec2.internal,UID:ip-10-0-136-147.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-147.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-147.ec2.internal,},FirstTimestamp:2026-04-16 23:25:39.917126776 +0000 UTC m=+0.518920973,LastTimestamp:2026-04-16 23:25:39.917126776 +0000 UTC m=+0.518920973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-147.ec2.internal,}"
Apr 16 23:25:39.921648 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.921636 2569 policy_none.go:49] "None policy: Start"
Apr 16 23:25:39.921691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.921652 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 23:25:39.921691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.921662 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 23:25:39.964564 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.964545 2569 manager.go:341] "Starting Device Plugin manager"
23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.964585 2569 server.go:85] "Starting device plugin registration server" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.964791 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.964803 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.964984 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.965060 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.965068 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.965573 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.965612 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.974503 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.975701 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.975751 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.975768 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
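The sequence up to this point is the kubelet's TLS bootstrap: list/watch and lease calls are rejected for user "system:anonymous" until the client CSR (csr-xvl5r) is approved and then issued, after which the kubelet retries with its signed certificate. A minimal client-go sketch for observing that CSR state from outside the node — the kubeconfig path is an assumption, and this only reads back what the csr.go lines above already report:

// csr_check.go - a minimal sketch (assumes a reachable cluster and a
// kubeconfig at the default path) that lists CertificateSigningRequests
// and prints their approval state, mirroring the csr.go entries above.
package main

import (
	"context"
	"fmt"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/util/homedir"
)

func main() {
	kubeconfig := filepath.Join(homedir.HomeDir(), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// CSRs are cluster-scoped, so no namespace is given.
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		state := "Pending"
		for _, c := range csr.Status.Conditions {
			// Approved/Denied conditions are set by the approver; the
			// certificate itself appears later, once it is issued.
			state = string(c.Type)
		}
		issued := len(csr.Status.Certificate) > 0
		fmt.Printf("%s signer=%s state=%s issued=%v\n", csr.Name, csr.Spec.SignerName, state, issued)
	}
}

In the log above the two phases are visible as "Certificate signing request is approved, waiting to be issued" followed ~8ms later by "Certificate signing request is issued".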
Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.975775 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:39.975806 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:25:39.987602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:39.977878 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:40.065173 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.065151 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:40.066751 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.066733 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:40.066827 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.066766 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:40.066827 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.066780 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:40.066827 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.066803 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.074869 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.074846 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.074925 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.074871 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-147.ec2.internal\": node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.075913 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.075886 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal"] Apr 16 23:25:40.075982 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.075971 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:40.076864 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.076850 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:40.076931 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.076876 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:40.076931 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.076886 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:40.079126 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079114 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:40.079286 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079273 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.079321 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079301 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:40.079971 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079939 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:40.080041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079942 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:40.080041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.079988 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:40.080041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.080003 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:40.080041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.080004 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:40.080041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.080039 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:40.082193 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.082176 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.082298 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.082200 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:25:40.082847 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.082831 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:25:40.082935 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.082860 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:25:40.082935 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.082870 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:25:40.083155 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.083132 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.100791 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.100766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.100874 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.100796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/host-path/a191383db2d48d2a4a230e7ba22ce803-config\") pod \"kube-apiserver-proxy-ip-10-0-136-147.ec2.internal\" (UID: \"a191383db2d48d2a4a230e7ba22ce803\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.100874 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.100816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.101512 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.101495 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-147.ec2.internal\" not found" node="ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.105541 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.105526 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-147.ec2.internal\" not found" node="ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.183460 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.183430 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.201844 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201822 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.201909 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.201909 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a191383db2d48d2a4a230e7ba22ce803-config\") pod \"kube-apiserver-proxy-ip-10-0-136-147.ec2.internal\" (UID: \"a191383db2d48d2a4a230e7ba22ce803\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.202003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a191383db2d48d2a4a230e7ba22ce803-config\") pod \"kube-apiserver-proxy-ip-10-0-136-147.ec2.internal\" (UID: \"a191383db2d48d2a4a230e7ba22ce803\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.202003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201920 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.202003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.201932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/575229c02bfdf674e77adf5c9c984b21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal\" (UID: \"575229c02bfdf674e77adf5c9c984b21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.283998 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.283947 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.384840 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.384785 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.405228 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.405198 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.407998 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.407976 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:40.485202 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.485169 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.585745 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.585708 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.686275 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.686199 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.786779 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.786745 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.796871 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.796845 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 23:25:40.797072 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.797041 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 23:25:40.887705 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.887674 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.897683 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.897662 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 23:25:40.909120 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.909090 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 23:25:40.911910 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.911888 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:20:39 +0000 UTC" deadline="2027-10-07 15:38:42.642064483 +0000 UTC" Apr 16 23:25:40.911983 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.911909 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12928h13m1.730157981s" Apr 16 23:25:40.937123 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.937066 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t8zjh" Apr 16 23:25:40.944611 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.944587 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t8zjh" Apr 16 23:25:40.987760 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:40.987733 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:40.990998 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:40.990981 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:41.034065 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.034046 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:41.040886 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:41.040848 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda191383db2d48d2a4a230e7ba22ce803.slice/crio-f899f51908c3b0bdc4606d620994ceaaf9598a2480c69bf2b254f979ff6c0454 WatchSource:0}: Error finding container f899f51908c3b0bdc4606d620994ceaaf9598a2480c69bf2b254f979ff6c0454: Status 404 returned error can't find the container with id f899f51908c3b0bdc4606d620994ceaaf9598a2480c69bf2b254f979ff6c0454 Apr 16 23:25:41.046373 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.046360 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:25:41.088040 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:41.088009 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:41.188713 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:41.188640 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-147.ec2.internal\" not found" Apr 16 23:25:41.282687 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.282665 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:41.297666 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.297646 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" Apr 16 23:25:41.311059 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.311041 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:25:41.312057 ip-10-0-136-147 
kubenswrapper[2569]: I0416 23:25:41.312046 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" Apr 16 23:25:41.319651 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.319639 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 23:25:41.542366 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:41.542338 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575229c02bfdf674e77adf5c9c984b21.slice/crio-9054e16603ffd2f494c3ec22951191b7e1014b494794447624a100c03e04523b WatchSource:0}: Error finding container 9054e16603ffd2f494c3ec22951191b7e1014b494794447624a100c03e04523b: Status 404 returned error can't find the container with id 9054e16603ffd2f494c3ec22951191b7e1014b494794447624a100c03e04523b Apr 16 23:25:41.873915 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.873800 2569 apiserver.go:52] "Watching apiserver" Apr 16 23:25:41.883132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.883078 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 23:25:41.884509 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.884483 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-hc8fq","openshift-ovn-kubernetes/ovnkube-node-jvvbp","kube-system/konnectivity-agent-c8wlz","openshift-cluster-node-tuning-operator/tuned-cqhhq","openshift-image-registry/node-ca-cw7zr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal","openshift-multus/multus-additional-cni-plugins-8t8tf","openshift-multus/network-metrics-daemon-2mrw9","openshift-network-operator/iptables-alerter-vfhrc","kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k","openshift-multus/multus-vhf6l"] Apr 16 23:25:41.887053 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.887029 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:41.887155 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:41.887110 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:41.895583 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.895556 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.898308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.897798 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:41.898308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.898130 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 23:25:41.898308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.898253 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 23:25:41.898308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.898258 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mvt7v\"" Apr 16 23:25:41.898610 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.898593 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 23:25:41.899208 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.898899 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.899294 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.899195 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.899294 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.899236 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 23:25:41.900191 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.900172 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rbrlz\"" Apr 16 23:25:41.900409 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.900391 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 23:25:41.900518 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.900507 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 23:25:41.902262 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.902242 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.902345 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.902334 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:41.904525 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.904487 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.904767 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.904749 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.904866 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.904783 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pkmwl\"" Apr 16 23:25:41.905274 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.905038 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.905274 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.905069 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q6zqf\"" Apr 16 23:25:41.905274 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.905108 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 23:25:41.905468 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.905340 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.905468 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.905408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:41.907577 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.907539 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.907707 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.907648 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:41.907808 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:41.907721 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:41.908031 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.908014 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 23:25:41.908119 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.908107 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 23:25:41.908178 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.908156 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.908232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.908206 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 23:25:41.908400 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.908383 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwdv4\"" Apr 16 23:25:41.910691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-tmp\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.910691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.910691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-netd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.910691 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910682 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-config\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-var-lib-kubelet\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-host\") pod 
\"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-systemd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910871 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-systemd\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-sys\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-etc-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910932 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-run\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-tuned\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 
ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910976 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f52qk\" (UniqueName: \"kubernetes.io/projected/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-kube-api-access-f52qk\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.910998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-kubelet\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911021 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-var-lib-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911044 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-konnectivity-ca\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-modprobe-d\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911119 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-kubernetes\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911138 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-log-socket\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-env-overrides\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911218 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e35128f8-82c9-4513-a410-0656f5f37ece-ovn-node-metrics-cert\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911238 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-script-lib\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-ovn\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-node-log\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-bin\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911304 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgr4q\" (UniqueName: \"kubernetes.io/projected/e35128f8-82c9-4513-a410-0656f5f37ece-kube-api-access-xgr4q\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysconfig\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-d\") pod \"tuned-cqhhq\" (UID: 
\"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911427 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-conf\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-netns\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-agent-certs\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:41.911710 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911501 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-systemd-units\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.912444 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-slash\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:41.912444 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.911547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-lib-modules\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:41.912444 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.912157 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:41.914527 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.914501 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.914527 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.914516 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.914747 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.914592 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 23:25:41.914747 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.914703 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7lzd4\"" Apr 16 23:25:41.914953 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.914936 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:41.917151 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.917131 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 23:25:41.917444 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.917285 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vhf6l" Apr 16 23:25:41.917598 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.917578 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 23:25:41.917713 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.917692 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 23:25:41.917982 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.917950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4d7bt\"" Apr 16 23:25:41.919367 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.919350 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 23:25:41.919737 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.919722 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rq8fl\"" Apr 16 23:25:41.945387 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.945354 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:20:40 +0000 UTC" deadline="2027-10-08 23:57:05.461517647 +0000 UTC" Apr 16 23:25:41.945387 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.945386 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12960h31m23.516135266s" Apr 16 23:25:41.980754 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.980711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" 
event={"ID":"575229c02bfdf674e77adf5c9c984b21","Type":"ContainerStarted","Data":"9054e16603ffd2f494c3ec22951191b7e1014b494794447624a100c03e04523b"} Apr 16 23:25:41.981618 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.981595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" event={"ID":"a191383db2d48d2a4a230e7ba22ce803","Type":"ContainerStarted","Data":"f899f51908c3b0bdc4606d620994ceaaf9598a2480c69bf2b254f979ff6c0454"} Apr 16 23:25:41.998171 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:41.998149 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 23:25:42.011853 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.011825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgr4q\" (UniqueName: \"kubernetes.io/projected/e35128f8-82c9-4513-a410-0656f5f37ece-kube-api-access-xgr4q\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.011983 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.011868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-os-release\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.011983 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.011897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:42.011983 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.011923 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8sxd\" (UniqueName: \"kubernetes.io/projected/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-kube-api-access-x8sxd\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.011983 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.011947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012003 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-hostroot\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-conf\") pod \"tuned-cqhhq\" (UID: 
\"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012059 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-netns\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012099 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6rd\" (UniqueName: \"kubernetes.io/projected/621008c7-04c4-43b9-87b0-8ba9013bdecc-kube-api-access-gt6rd\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-socket-dir-parent\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012162 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8492m\" (UniqueName: \"kubernetes.io/projected/57932062-ba33-4931-9e05-3612d8392b49-kube-api-access-8492m\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.012196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-netns\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012189 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-systemd-units\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-slash\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-conf\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/621008c7-04c4-43b9-87b0-8ba9013bdecc-host\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-systemd-units\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012273 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-slash\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/621008c7-04c4-43b9-87b0-8ba9013bdecc-serviceca\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-tmp\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-netd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-config\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-host\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012397 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-cnibin\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-host\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012456 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-netd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.012478 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-sys\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-etc-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-sys\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012537 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkww\" (UniqueName: \"kubernetes.io/projected/be198f06-cdcd-4d45-84cb-08bf655ad486-kube-api-access-rwkww\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f52qk\" (UniqueName: \"kubernetes.io/projected/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-kube-api-access-f52qk\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 
23:25:42.012575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-etc-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-system-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-run-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-cnibin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012668 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-os-release\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012670 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-conf-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012716 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-modprobe-d\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-kubernetes\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-env-overrides\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-script-lib\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-kubernetes\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-modprobe-d\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012848 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-bin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012874 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-ovn\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-config\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-node-log\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012986 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-node-log\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.012985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-iptables-alerter-script\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013014 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-ovn\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013045 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-device-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-sys-fs\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2kf\" (UniqueName: \"kubernetes.io/projected/bc508e67-73e5-4aac-af7a-552184ffbe0a-kube-api-access-ss2kf\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-multus-daemon-config\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysconfig\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-d\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-agent-certs\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:42.013908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysconfig\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-ovnkube-script-lib\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013382 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-sysctl-d\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013388 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-host-slash\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-multus\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-lib-modules\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013484 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-cni-binary-copy\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-netns\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-etc-kubernetes\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 
23:25:42.013568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-var-lib-kubelet\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013613 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-lib-modules\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-var-lib-kubelet\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013615 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-systemd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-run-systemd\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.014705 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013664 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-binary-copy\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013712 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-socket-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013728 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-systemd\") pod \"tuned-cqhhq\" (UID: 
\"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013727 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-system-cni-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-kubelet\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-multus-certs\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013800 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-systemd\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013810 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-run\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013833 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-tuned\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013910 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e35128f8-82c9-4513-a410-0656f5f37ece-env-overrides\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-kubelet\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013919 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-run\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.013953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-var-lib-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014004 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-kubelet\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-var-lib-openvswitch\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.015466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-konnectivity-ca\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbnj\" (UniqueName: \"kubernetes.io/projected/28beae99-07bd-4677-b7d1-d83bd564ca27-kube-api-access-qvbnj\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " 
pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014189 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-log-socket\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e35128f8-82c9-4513-a410-0656f5f37ece-ovn-node-metrics-cert\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-registration-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014266 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-log-socket\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014274 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-k8s-cni-cncf-io\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014326 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-bin\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014404 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e35128f8-82c9-4513-a410-0656f5f37ece-host-cni-bin\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.014550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-konnectivity-ca\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.016069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-etc-tuned\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.016117 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.016113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-tmp\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.016476 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.016351 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e35128f8-82c9-4513-a410-0656f5f37ece-ovn-node-metrics-cert\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.016476 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.016393 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0-agent-certs\") pod \"konnectivity-agent-c8wlz\" (UID: \"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0\") " pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:42.020660 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.020639 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:25:42.020660 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.020660 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:25:42.020798 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.020669 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:42.020798 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.020730 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:25:42.520707683 +0000 UTC m=+3.122501872 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:42.023029 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.022991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f52qk\" (UniqueName: \"kubernetes.io/projected/4babecb1-0de5-498a-a5b6-43afcbc2a4e7-kube-api-access-f52qk\") pod \"tuned-cqhhq\" (UID: \"4babecb1-0de5-498a-a5b6-43afcbc2a4e7\") " pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.023384 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.023369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgr4q\" (UniqueName: \"kubernetes.io/projected/e35128f8-82c9-4513-a410-0656f5f37ece-kube-api-access-xgr4q\") pod \"ovnkube-node-jvvbp\" (UID: \"e35128f8-82c9-4513-a410-0656f5f37ece\") " pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.039643 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.039621 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:25:42.115169 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115133 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkww\" (UniqueName: \"kubernetes.io/projected/be198f06-cdcd-4d45-84cb-08bf655ad486-kube-api-access-rwkww\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115180 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-system-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-cnibin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-os-release\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115256 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-conf-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.115330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-bin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-iptables-alerter-script\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115358 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-device-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-sys-fs\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115404 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2kf\" (UniqueName: \"kubernetes.io/projected/bc508e67-73e5-4aac-af7a-552184ffbe0a-kube-api-access-ss2kf\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-os-release\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-multus-daemon-config\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115462 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-conf-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 
23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-system-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115531 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-device-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-host-slash\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115571 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-bin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.115595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-multus\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-cni-binary-copy\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-netns\") pod \"multus-vhf6l\" (UID: 
\"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115659 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-etc-kubernetes\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115667 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-cnibin\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115689 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-binary-copy\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115706 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-cni-multus\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-socket-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-system-cni-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115795 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-kubelet\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115842 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-multus-certs\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115901 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbnj\" (UniqueName: \"kubernetes.io/projected/28beae99-07bd-4677-b7d1-d83bd564ca27-kube-api-access-qvbnj\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.115974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-registration-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.116259 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116019 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-host-slash\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116042 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-registration-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-iptables-alerter-script\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-k8s-cni-cncf-io\") pod \"multus-vhf6l\" (UID: 
\"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-sys-fs\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-os-release\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-netns\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-etc-kubernetes\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8sxd\" (UniqueName: \"kubernetes.io/projected/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-kube-api-access-x8sxd\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-k8s-cni-cncf-io\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-hostroot\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6rd\" (UniqueName: \"kubernetes.io/projected/621008c7-04c4-43b9-87b0-8ba9013bdecc-kube-api-access-gt6rd\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-cni-binary-copy\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.116217 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-etc-selinux\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.117032 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.116626 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:25:42.616605476 +0000 UTC m=+3.218399715 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/57932062-ba33-4931-9e05-3612d8392b49-multus-daemon-config\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-hostroot\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-system-cni-dir\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-os-release\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.117773 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.116798 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-var-lib-kubelet\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.118204 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118180 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/be198f06-cdcd-4d45-84cb-08bf655ad486-cni-binary-copy\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 
23:25:42.118256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118233 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-host-run-multus-certs\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bc508e67-73e5-4aac-af7a-552184ffbe0a-socket-dir\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118379 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-socket-dir-parent\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118414 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8492m\" (UniqueName: \"kubernetes.io/projected/57932062-ba33-4931-9e05-3612d8392b49-kube-api-access-8492m\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/621008c7-04c4-43b9-87b0-8ba9013bdecc-host\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/621008c7-04c4-43b9-87b0-8ba9013bdecc-serviceca\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-cnibin\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118553 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.118661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-cni-dir\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.119024 
ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/57932062-ba33-4931-9e05-3612d8392b49-multus-socket-dir-parent\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.119024 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.118854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/621008c7-04c4-43b9-87b0-8ba9013bdecc-host\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.119949 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.119265 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/621008c7-04c4-43b9-87b0-8ba9013bdecc-serviceca\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.119949 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.119336 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/be198f06-cdcd-4d45-84cb-08bf655ad486-cnibin\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.126081 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.126026 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkww\" (UniqueName: \"kubernetes.io/projected/be198f06-cdcd-4d45-84cb-08bf655ad486-kube-api-access-rwkww\") pod \"multus-additional-cni-plugins-8t8tf\" (UID: \"be198f06-cdcd-4d45-84cb-08bf655ad486\") " pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.126185 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.126087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6rd\" (UniqueName: \"kubernetes.io/projected/621008c7-04c4-43b9-87b0-8ba9013bdecc-kube-api-access-gt6rd\") pod \"node-ca-cw7zr\" (UID: \"621008c7-04c4-43b9-87b0-8ba9013bdecc\") " pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.126832 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.126808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8sxd\" (UniqueName: \"kubernetes.io/projected/0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471-kube-api-access-x8sxd\") pod \"iptables-alerter-vfhrc\" (UID: \"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471\") " pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.126925 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.126813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2kf\" (UniqueName: \"kubernetes.io/projected/bc508e67-73e5-4aac-af7a-552184ffbe0a-kube-api-access-ss2kf\") pod \"aws-ebs-csi-driver-node-ffs2k\" (UID: \"bc508e67-73e5-4aac-af7a-552184ffbe0a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.127343 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.127319 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbnj\" (UniqueName: \"kubernetes.io/projected/28beae99-07bd-4677-b7d1-d83bd564ca27-kube-api-access-qvbnj\") pod \"network-metrics-daemon-2mrw9\" (UID: 
\"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:42.128150 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.128130 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8492m\" (UniqueName: \"kubernetes.io/projected/57932062-ba33-4931-9e05-3612d8392b49-kube-api-access-8492m\") pod \"multus-vhf6l\" (UID: \"57932062-ba33-4931-9e05-3612d8392b49\") " pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.207406 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.207375 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:25:42.214265 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.214240 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:42.223123 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.223084 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" Apr 16 23:25:42.228796 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.228772 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cw7zr" Apr 16 23:25:42.235395 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.235374 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" Apr 16 23:25:42.243046 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.243024 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vfhrc" Apr 16 23:25:42.250703 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.250686 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" Apr 16 23:25:42.256276 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.256258 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vhf6l" Apr 16 23:25:42.521287 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.521208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:42.521450 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.521396 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:25:42.521450 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.521419 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:25:42.521450 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.521432 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:42.521570 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.521501 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:25:43.521479026 +0000 UTC m=+4.123273226 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:42.560747 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.560711 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7cbe5f0_8f20_44e0_b2d5_f50ce28222f0.slice/crio-21613fc861b9b577a5a9fcc72532bd8f5f2097bc38a19fa5f96a4040e3ddc171 WatchSource:0}: Error finding container 21613fc861b9b577a5a9fcc72532bd8f5f2097bc38a19fa5f96a4040e3ddc171: Status 404 returned error can't find the container with id 21613fc861b9b577a5a9fcc72532bd8f5f2097bc38a19fa5f96a4040e3ddc171 Apr 16 23:25:42.561807 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.561781 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f0e8eaf_9df8_4e3c_a7dc_bc20280ed471.slice/crio-0a818235eede99afbf1ccea8b3ea559ec04fe3ac93cc04b411701719cfc2c179 WatchSource:0}: Error finding container 0a818235eede99afbf1ccea8b3ea559ec04fe3ac93cc04b411701719cfc2c179: Status 404 returned error can't find the container with id 0a818235eede99afbf1ccea8b3ea559ec04fe3ac93cc04b411701719cfc2c179 Apr 16 23:25:42.563874 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.563785 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc508e67_73e5_4aac_af7a_552184ffbe0a.slice/crio-f85db09a7f9263fe99819e8a1a1ae3b1de3f669b029312e851d0163cf4d9d5a5 WatchSource:0}: Error finding container f85db09a7f9263fe99819e8a1a1ae3b1de3f669b029312e851d0163cf4d9d5a5: Status 404 returned error can't find the container with id f85db09a7f9263fe99819e8a1a1ae3b1de3f669b029312e851d0163cf4d9d5a5 Apr 16 23:25:42.568716 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.568612 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57932062_ba33_4931_9e05_3612d8392b49.slice/crio-cb7470c5d42f215771b4728cbb1e60e3440eb2add3147b7c680976d33d20bb59 WatchSource:0}: Error finding container cb7470c5d42f215771b4728cbb1e60e3440eb2add3147b7c680976d33d20bb59: Status 404 returned error can't find the container with id cb7470c5d42f215771b4728cbb1e60e3440eb2add3147b7c680976d33d20bb59 Apr 16 23:25:42.569254 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.569231 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode35128f8_82c9_4513_a410_0656f5f37ece.slice/crio-51aab122eeff56a91f0ad04b301485836bf38410270d1556765f5cccc270e216 WatchSource:0}: Error finding container 51aab122eeff56a91f0ad04b301485836bf38410270d1556765f5cccc270e216: Status 404 returned error can't find the container with id 51aab122eeff56a91f0ad04b301485836bf38410270d1556765f5cccc270e216 Apr 16 23:25:42.571572 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:25:42.571551 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4babecb1_0de5_498a_a5b6_43afcbc2a4e7.slice/crio-24d7fa168c9db02befd00c2ecd69de8fac18fc2bb5aa4b72de4b0bf830d81879 WatchSource:0}: Error finding 
Apr 16 23:25:42.622357 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.622331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:42.622494 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.622468 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:42.622544 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:42.622537 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:25:43.622518043 +0000 UTC m=+4.224312241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:42.946339 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.946310 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:20:40 +0000 UTC" deadline="2027-10-15 04:26:55.221382131 +0000 UTC"
Apr 16 23:25:42.946339 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.946337 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13109h1m12.275047758s"
Apr 16 23:25:42.984548 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.984404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"51aab122eeff56a91f0ad04b301485836bf38410270d1556765f5cccc270e216"}
Apr 16 23:25:42.992824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.992795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" event={"ID":"bc508e67-73e5-4aac-af7a-552184ffbe0a","Type":"ContainerStarted","Data":"f85db09a7f9263fe99819e8a1a1ae3b1de3f669b029312e851d0163cf4d9d5a5"}
Apr 16 23:25:42.996746 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.996705 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c8wlz" event={"ID":"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0","Type":"ContainerStarted","Data":"21613fc861b9b577a5a9fcc72532bd8f5f2097bc38a19fa5f96a4040e3ddc171"}
Apr 16 23:25:42.998520 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.998495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" event={"ID":"4babecb1-0de5-498a-a5b6-43afcbc2a4e7","Type":"ContainerStarted","Data":"24d7fa168c9db02befd00c2ecd69de8fac18fc2bb5aa4b72de4b0bf830d81879"}
Apr 16 23:25:42.999495 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:42.999475 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerStarted","Data":"1a44393b82a2a3381d7de66d778ea5ad5dffb5b5eac14075674c0620056fbd1a"}
Apr 16 23:25:43.000475 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.000457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vhf6l" event={"ID":"57932062-ba33-4931-9e05-3612d8392b49","Type":"ContainerStarted","Data":"cb7470c5d42f215771b4728cbb1e60e3440eb2add3147b7c680976d33d20bb59"}
Apr 16 23:25:43.001522 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.001495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cw7zr" event={"ID":"621008c7-04c4-43b9-87b0-8ba9013bdecc","Type":"ContainerStarted","Data":"7d8ac411779e748b490e2d9845d72934af36abe939978196bef91b63b6f517ed"}
Apr 16 23:25:43.002302 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.002277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vfhrc" event={"ID":"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471","Type":"ContainerStarted","Data":"0a818235eede99afbf1ccea8b3ea559ec04fe3ac93cc04b411701719cfc2c179"}
Apr 16 23:25:43.527530 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.527497 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq"
Apr 16 23:25:43.527754 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.527638 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:25:43.527754 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.527656 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:25:43.527754 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.527668 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:25:43.527754 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.527717 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:25:45.527700981 +0000 UTC m=+6.129495179 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:25:43.628238 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.628206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:43.628396 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.628347 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:43.628449 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.628409 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:25:45.628390942 +0000 UTC m=+6.230185140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:43.981512 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.978574 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq"
Apr 16 23:25:43.981512 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.978691 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa"
Apr 16 23:25:43.981512 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:43.979135 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:43.981512 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:43.979230 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27"
pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:44.008753 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:44.008719 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" event={"ID":"a191383db2d48d2a4a230e7ba22ce803","Type":"ContainerStarted","Data":"c81999d6e38884d6d62a8ce3b0cbc5679d72642ac4d988f1195233cc3cc8af30"} Apr 16 23:25:44.021349 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:44.021296 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-147.ec2.internal" podStartSLOduration=3.021278292 podStartE2EDuration="3.021278292s" podCreationTimestamp="2026-04-16 23:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:25:44.020684181 +0000 UTC m=+4.622478389" watchObservedRunningTime="2026-04-16 23:25:44.021278292 +0000 UTC m=+4.623072501" Apr 16 23:25:45.015796 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.015755 2569 generic.go:358] "Generic (PLEG): container finished" podID="575229c02bfdf674e77adf5c9c984b21" containerID="3b31f3019aa220c5cbac421c1a2ee4d7cff6bf15fde40f1052df8e791921e877" exitCode=0 Apr 16 23:25:45.016266 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.016049 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" event={"ID":"575229c02bfdf674e77adf5c9c984b21","Type":"ContainerDied","Data":"3b31f3019aa220c5cbac421c1a2ee4d7cff6bf15fde40f1052df8e791921e877"} Apr 16 23:25:45.548900 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.548861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:45.549123 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.549045 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:25:45.549123 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.549066 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:25:45.549123 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.549076 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:45.549301 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.549132 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:25:49.549113568 +0000 UTC m=+10.150907756 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:45.649651 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.649606 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:45.649806 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.649755 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:45.649883 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.649816 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:25:49.649797972 +0000 UTC m=+10.251592160 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:45.978712 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.978632 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:45.978868 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.978755 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:45.979191 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:45.979171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:45.979302 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:45.979271 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:47.976150 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:47.976110 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:47.976629 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:47.976232 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:47.976694 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:47.976659 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:47.976803 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:47.976778 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:48.804160 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.804132 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5bhhr"] Apr 16 23:25:48.814218 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.814188 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bhhr" Apr 16 23:25:48.817377 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.817212 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q2d9s\"" Apr 16 23:25:48.817377 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.817332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 23:25:48.818919 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.818341 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 23:25:48.876632 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.876581 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b15b520-d13a-41b6-b06f-81365371c0a0-hosts-file\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr" Apr 16 23:25:48.876632 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.876632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b15b520-d13a-41b6-b06f-81365371c0a0-tmp-dir\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr" Apr 16 23:25:48.876828 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.876658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgh4l\" (UniqueName: \"kubernetes.io/projected/1b15b520-d13a-41b6-b06f-81365371c0a0-kube-api-access-vgh4l\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr" Apr 16 23:25:48.978326 ip-10-0-136-147 
Apr 16 23:25:48.978772 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.978362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b15b520-d13a-41b6-b06f-81365371c0a0-tmp-dir\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:48.978772 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.978393 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgh4l\" (UniqueName: \"kubernetes.io/projected/1b15b520-d13a-41b6-b06f-81365371c0a0-kube-api-access-vgh4l\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:48.978772 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.978618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1b15b520-d13a-41b6-b06f-81365371c0a0-hosts-file\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:48.980653 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.978983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b15b520-d13a-41b6-b06f-81365371c0a0-tmp-dir\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:48.987248 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:48.987221 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgh4l\" (UniqueName: \"kubernetes.io/projected/1b15b520-d13a-41b6-b06f-81365371c0a0-kube-api-access-vgh4l\") pod \"node-resolver-5bhhr\" (UID: \"1b15b520-d13a-41b6-b06f-81365371c0a0\") " pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:49.128156 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:49.128028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5bhhr"
Apr 16 23:25:49.583621 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:49.583569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq"
Apr 16 23:25:49.583837 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.583725 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:25:49.583837 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.583744 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:25:49.583837 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.583757 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:25:49.583837 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.583816 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:25:57.583798108 +0000 UTC m=+18.185592294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:25:49.684128 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:49.684087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:49.684315 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.684244 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:49.684375 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.684320 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:25:57.684300984 +0000 UTC m=+18.286095180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:25:49.976825 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:49.976792 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq"
Apr 16 23:25:49.976981 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.976896 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa"
Apr 16 23:25:49.976981 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:49.976949 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:49.977129 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:49.977101 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27"
Apr 16 23:25:51.976199 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:51.976004 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq"
Apr 16 23:25:51.976592 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:51.976047 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9"
Apr 16 23:25:51.976592 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:51.976305 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa"
Apr 16 23:25:51.976592 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:51.976453 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27"
pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:52.028639 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.028597 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" event={"ID":"4babecb1-0de5-498a-a5b6-43afcbc2a4e7","Type":"ContainerStarted","Data":"5d6ff53633f2d981ad8b47f26c6e9601af512adae622f8c8508fcdc5981d4ecc"} Apr 16 23:25:52.030296 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.030128 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="4d7b942392a3eeb8705e4b305e2192bb357dd4199f885937a3c6877317337363" exitCode=0 Apr 16 23:25:52.030296 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.030249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"4d7b942392a3eeb8705e4b305e2192bb357dd4199f885937a3c6877317337363"} Apr 16 23:25:52.031952 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.031919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cw7zr" event={"ID":"621008c7-04c4-43b9-87b0-8ba9013bdecc","Type":"ContainerStarted","Data":"9468defb5fb84aafe9a35e4978afa8fd02649f051cf918e37a219f9a595450ff"} Apr 16 23:25:52.035206 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.035184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" event={"ID":"575229c02bfdf674e77adf5c9c984b21","Type":"ContainerStarted","Data":"04ffd4e573351c0f7863a427d0d7876585e32364f000039cf9e2e68441ceb3ff"} Apr 16 23:25:52.037569 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.037543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" event={"ID":"bc508e67-73e5-4aac-af7a-552184ffbe0a","Type":"ContainerStarted","Data":"956346c823fad41c5904395b1e552d184b2ddb224fccb40724ba33e46dacc00f"} Apr 16 23:25:52.039168 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.039141 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c8wlz" event={"ID":"f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0","Type":"ContainerStarted","Data":"1020a9fcc640532f0e12dc90ea6eef569c2ccd7483e34968067a9555cd9f4903"} Apr 16 23:25:52.040604 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.040582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5bhhr" event={"ID":"1b15b520-d13a-41b6-b06f-81365371c0a0","Type":"ContainerStarted","Data":"2f50eeea74c5c6217f7ef89e7d7e2fa23d556943fee3b65d1c2ebf4f3caf1358"} Apr 16 23:25:52.040688 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.040612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5bhhr" event={"ID":"1b15b520-d13a-41b6-b06f-81365371c0a0","Type":"ContainerStarted","Data":"aff60872c215eb7993387293e07112a2c492fe7d0aeeb00a2f4ca8dd711cbf79"} Apr 16 23:25:52.042650 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.042603 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cqhhq" podStartSLOduration=3.481330985 podStartE2EDuration="12.042592166s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.592687832 +0000 UTC m=+3.194482016" 
lastFinishedPulling="2026-04-16 23:25:51.153948998 +0000 UTC m=+11.755743197" observedRunningTime="2026-04-16 23:25:52.041947635 +0000 UTC m=+12.643741841" watchObservedRunningTime="2026-04-16 23:25:52.042592166 +0000 UTC m=+12.644386371" Apr 16 23:25:52.052787 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.052740 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cw7zr" podStartSLOduration=3.483586672 podStartE2EDuration="12.05272847s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.567877443 +0000 UTC m=+3.169671628" lastFinishedPulling="2026-04-16 23:25:51.137019242 +0000 UTC m=+11.738813426" observedRunningTime="2026-04-16 23:25:52.052495933 +0000 UTC m=+12.654290142" watchObservedRunningTime="2026-04-16 23:25:52.05272847 +0000 UTC m=+12.654522676" Apr 16 23:25:52.104286 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.104243 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-147.ec2.internal" podStartSLOduration=11.104228443 podStartE2EDuration="11.104228443s" podCreationTimestamp="2026-04-16 23:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:25:52.103686454 +0000 UTC m=+12.705480663" watchObservedRunningTime="2026-04-16 23:25:52.104228443 +0000 UTC m=+12.706022649" Apr 16 23:25:52.117275 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.117222 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5bhhr" podStartSLOduration=4.117205657 podStartE2EDuration="4.117205657s" podCreationTimestamp="2026-04-16 23:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:25:52.11707408 +0000 UTC m=+12.718868288" watchObservedRunningTime="2026-04-16 23:25:52.117205657 +0000 UTC m=+12.718999867" Apr 16 23:25:52.128552 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.128510 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c8wlz" podStartSLOduration=3.56390148 podStartE2EDuration="12.128498561s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.562855524 +0000 UTC m=+3.164649721" lastFinishedPulling="2026-04-16 23:25:51.127452613 +0000 UTC m=+11.729246802" observedRunningTime="2026-04-16 23:25:52.127996437 +0000 UTC m=+12.729790644" watchObservedRunningTime="2026-04-16 23:25:52.128498561 +0000 UTC m=+12.730292766" Apr 16 23:25:52.688531 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.688496 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:52.689277 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:52.689240 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:53.044328 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.044235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vfhrc" event={"ID":"0f0e8eaf-9df8-4e3c-a7dc-bc20280ed471","Type":"ContainerStarted","Data":"074b696cd0023626e7380e02be088fcc51b9e939b546c4b1962d7c114801bdcd"} Apr 16 23:25:53.048343 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.048316 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:53.048467 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.048403 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c8wlz" Apr 16 23:25:53.058171 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.058122 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vfhrc" podStartSLOduration=4.485176886 podStartE2EDuration="13.058107109s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.564376164 +0000 UTC m=+3.166170351" lastFinishedPulling="2026-04-16 23:25:51.137306372 +0000 UTC m=+11.739100574" observedRunningTime="2026-04-16 23:25:53.057954437 +0000 UTC m=+13.659748643" watchObservedRunningTime="2026-04-16 23:25:53.058107109 +0000 UTC m=+13.659901316" Apr 16 23:25:53.976621 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.976586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:53.976781 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:53.976598 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:53.976781 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:53.976711 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:53.976781 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:53.976771 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:55.976136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:55.976099 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:55.977010 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:55.976231 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:55.977010 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:55.976277 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:55.977010 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:55.976366 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:57.641520 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:57.641483 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:57.641926 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.641682 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:25:57.641926 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.641706 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:25:57.641926 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.641721 2569 projected.go:194] Error preparing data for projected volume kube-api-access-ng7xg for pod openshift-network-diagnostics/network-check-target-hc8fq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:57.641926 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.641786 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg podName:ca2349c9-d7e3-465e-a9da-79633f8e3aaa nodeName:}" failed. No retries permitted until 2026-04-16 23:26:13.64176544 +0000 UTC m=+34.243559627 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ng7xg" (UniqueName: "kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg") pod "network-check-target-hc8fq" (UID: "ca2349c9-d7e3-465e-a9da-79633f8e3aaa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:25:57.742297 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:57.742265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:57.742474 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.742416 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:57.742527 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.742481 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:13.742466036 +0000 UTC m=+34.344260225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:25:57.976254 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:57.976189 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:57.976402 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.976294 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:25:57.976402 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:57.976349 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:57.976475 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:57.976429 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:59.976166 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:59.975909 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:25:59.977227 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:25:59.977065 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:25:59.977227 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:59.977172 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:25:59.977356 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:25:59.977263 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:00.059837 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.059629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"1dec431bedd65a7c4eed0daa80b1801510bef55e0f6e08471f8d03aab1ed43b9"} Apr 16 23:26:00.061941 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.061909 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vhf6l" event={"ID":"57932062-ba33-4931-9e05-3612d8392b49","Type":"ContainerStarted","Data":"61d5e5ce457053023d920cdd2d1b7be56b83d7db2539a299713a3c52b09f8a64"} Apr 16 23:26:00.187151 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.187094 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vhf6l" podStartSLOduration=2.89513471 podStartE2EDuration="20.187071397s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.570238149 +0000 UTC m=+3.172032335" lastFinishedPulling="2026-04-16 23:25:59.862174831 +0000 UTC m=+20.463969022" observedRunningTime="2026-04-16 23:26:00.096104618 +0000 UTC m=+20.697898864" watchObservedRunningTime="2026-04-16 23:26:00.187071397 +0000 UTC m=+20.788865604" Apr 16 23:26:00.187425 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.187404 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lr65x"] Apr 16 23:26:00.190362 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.190342 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.190447 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:00.190407 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:00.260677 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.260639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-kubelet-config\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.260875 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.260732 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.260875 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.260781 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-dbus\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.361541 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.361389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-kubelet-config\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.361541 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.361475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.361541 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.361488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-kubelet-config\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.361745 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.361556 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-dbus\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.361745 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:00.361617 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:00.361745 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:00.361666 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret podName:11bac137-baa0-441f-9af0-85cedda59681 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:00.861648324 +0000 UTC m=+21.463442513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret") pod "global-pull-secret-syncer-lr65x" (UID: "11bac137-baa0-441f-9af0-85cedda59681") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:00.361859 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.361765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/11bac137-baa0-441f-9af0-85cedda59681-dbus\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.497031 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.497009 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 23:26:00.865582 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.865488 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:00.865737 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:00.865632 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:00.865737 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:00.865701 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret podName:11bac137-baa0-441f-9af0-85cedda59681 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:01.86568199 +0000 UTC m=+22.467476177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret") pod "global-pull-secret-syncer-lr65x" (UID: "11bac137-baa0-441f-9af0-85cedda59681") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:00.976975 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.976874 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:26:00.497029142Z","UUID":"80cdc22f-ed48-49a0-80fd-33e89326d572","Handler":null,"Name":"","Endpoint":""} Apr 16 23:26:00.980184 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.980155 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 23:26:00.980184 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:00.980182 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 23:26:01.066924 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.066887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"1b5b11edc4560f486039340308e8f72e2576757e2fbc7cd37a5ecbedecc58338"} Apr 16 23:26:01.066924 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.066927 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"3b55a8fbdf7d58f82cdfed3992249f4b6c8ddc8ea3ef1f330e923fe4a0bcdb9c"} Apr 16 23:26:01.067136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.066944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"73f894a2dccb1a2ee9e08b2a97061f20dadb895d5c18f442be72c402d57f7bd0"} Apr 16 23:26:01.067136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.066975 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"8747daeca6c586644932585e37011df3245a0aa1656dacceb15bcd28c40b098c"} Apr 16 23:26:01.067136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.066990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"d56af7a863c80a9106b63b50ff284085c8879b1e02e1f030a1fac21c51cabc12"} Apr 16 23:26:01.068527 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.068495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" event={"ID":"bc508e67-73e5-4aac-af7a-552184ffbe0a","Type":"ContainerStarted","Data":"c976955b8abf8f1271c70508dc5d8e4aff164efa5858e445b7c469bb90d3a808"} Apr 16 23:26:01.870637 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.870563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: 
\"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:01.870974 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:01.870681 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:01.870974 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:01.870749 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret podName:11bac137-baa0-441f-9af0-85cedda59681 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:03.87073313 +0000 UTC m=+24.472527317 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret") pod "global-pull-secret-syncer-lr65x" (UID: "11bac137-baa0-441f-9af0-85cedda59681") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:01.976152 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.976116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:01.976333 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:01.976255 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:01.976333 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.976116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:01.976453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:01.976349 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:01.976453 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:01.976116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:01.976453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:01.976444 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:02.072827 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:02.072793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" event={"ID":"bc508e67-73e5-4aac-af7a-552184ffbe0a","Type":"ContainerStarted","Data":"fd185557d01c35b44ff81e4c476248c68a476608310c20850b778eb771a3deaa"} Apr 16 23:26:02.098111 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:02.098065 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ffs2k" podStartSLOduration=2.730804163 podStartE2EDuration="22.098050994s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.566876504 +0000 UTC m=+3.168670696" lastFinishedPulling="2026-04-16 23:26:01.934123339 +0000 UTC m=+22.535917527" observedRunningTime="2026-04-16 23:26:02.097683694 +0000 UTC m=+22.699477901" watchObservedRunningTime="2026-04-16 23:26:02.098050994 +0000 UTC m=+22.699845199" Apr 16 23:26:03.078302 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:03.078264 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"63da3793616cfb0c3146af1061b11a8d74822b5ad92d49560365b83c56afce98"} Apr 16 23:26:03.882183 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:03.882148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:03.882382 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:03.882309 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:03.882382 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:03.882377 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret podName:11bac137-baa0-441f-9af0-85cedda59681 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:07.882359229 +0000 UTC m=+28.484153412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret") pod "global-pull-secret-syncer-lr65x" (UID: "11bac137-baa0-441f-9af0-85cedda59681") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:03.977016 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:03.976976 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:03.977016 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:03.977004 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:03.977255 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:03.977056 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:03.977255 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:03.977177 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:03.977255 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:03.977235 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:03.977415 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:03.977287 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:05.083262 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.083006 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerStarted","Data":"a7db0e8aa48db78c4fef506b1faa2a34005255cac112f001b31b383b8db2cf6c"} Apr 16 23:26:05.086240 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.086204 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" event={"ID":"e35128f8-82c9-4513-a410-0656f5f37ece","Type":"ContainerStarted","Data":"8163d27eb8fed2051bb1c64dd479a107220ad22674b1551b3dec8dea00c2a35e"} Apr 16 23:26:05.086523 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.086501 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:05.100346 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.100322 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:05.129729 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.127951 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" podStartSLOduration=7.799560827 podStartE2EDuration="25.127932441s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.570701592 +0000 UTC m=+3.172495792" lastFinishedPulling="2026-04-16 23:25:59.89907322 +0000 UTC m=+20.500867406" observedRunningTime="2026-04-16 23:26:05.127128784 +0000 UTC m=+25.728922990" watchObservedRunningTime="2026-04-16 23:26:05.127932441 +0000 UTC m=+25.729726647" Apr 16 23:26:05.980537 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.980333 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:05.980537 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:05.980450 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:05.980757 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.980583 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:05.980757 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:05.980665 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:05.980757 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:05.980706 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:05.980896 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:05.980783 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:06.089280 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.089244 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="a7db0e8aa48db78c4fef506b1faa2a34005255cac112f001b31b383b8db2cf6c" exitCode=0 Apr 16 23:26:06.089649 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.089326 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"a7db0e8aa48db78c4fef506b1faa2a34005255cac112f001b31b383b8db2cf6c"} Apr 16 23:26:06.089649 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.089423 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:26:06.089822 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.089800 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:06.104076 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.104056 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:06.896829 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.896262 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lr65x"] Apr 16 23:26:06.896829 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.896389 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:06.896829 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:06.896498 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:06.900672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.900597 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hc8fq"] Apr 16 23:26:06.900806 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.900692 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:06.900867 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:06.900847 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:06.901303 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.901278 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2mrw9"] Apr 16 23:26:06.901493 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:06.901368 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:06.901493 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:06.901471 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:07.091352 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:07.091327 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:26:07.913519 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:07.913489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:07.913680 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:07.913595 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:07.913680 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:07.913639 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret podName:11bac137-baa0-441f-9af0-85cedda59681 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:15.913626284 +0000 UTC m=+36.515420468 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret") pod "global-pull-secret-syncer-lr65x" (UID: "11bac137-baa0-441f-9af0-85cedda59681") : object "kube-system"/"original-pull-secret" not registered Apr 16 23:26:08.094654 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.094609 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="a568210fd7387531f1b1feb54fa7441328f878be8fc7bb0f3d112246589a29ac" exitCode=0 Apr 16 23:26:08.095071 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.094688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"a568210fd7387531f1b1feb54fa7441328f878be8fc7bb0f3d112246589a29ac"} Apr 16 23:26:08.095071 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.094855 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:26:08.976677 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.976584 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:08.976677 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.976609 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:08.976677 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:08.976626 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:08.976880 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:08.976699 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:08.976880 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:08.976807 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:08.976880 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:08.976859 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:09.197370 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:09.197337 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:09.197810 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:09.197568 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:26:09.212453 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:09.212404 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" podUID="e35128f8-82c9-4513-a410-0656f5f37ece" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 23:26:09.221088 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:09.221054 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" podUID="e35128f8-82c9-4513-a410-0656f5f37ece" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 23:26:10.103694 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:10.103488 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="a8f684d9328b1b2b902a0130171c7e8eec1a8681671044d3476ba02eb56a7a2e" exitCode=0 Apr 16 23:26:10.103851 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:10.103574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"a8f684d9328b1b2b902a0130171c7e8eec1a8681671044d3476ba02eb56a7a2e"} Apr 16 23:26:10.976937 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:10.976906 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:10.977495 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:10.976907 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:10.977495 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:10.977047 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hc8fq" podUID="ca2349c9-d7e3-465e-a9da-79633f8e3aaa" Apr 16 23:26:10.977495 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:10.976907 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:10.977495 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:10.977109 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:26:10.977495 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:10.977166 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lr65x" podUID="11bac137-baa0-441f-9af0-85cedda59681" Apr 16 23:26:12.750137 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.750107 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-147.ec2.internal" event="NodeReady" Apr 16 23:26:12.750664 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.750259 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 23:26:12.790777 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.790742 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh"] Apr 16 23:26:12.811839 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.811811 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-684587478b-ftwbv"] Apr 16 23:26:12.812020 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.811863 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:12.814335 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.814313 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 23:26:12.814466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.814337 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-75sgs\"" Apr 16 23:26:12.814466 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.814375 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 23:26:12.832768 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.832744 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh"] Apr 16 23:26:12.832892 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.832773 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9jbxk"] Apr 16 23:26:12.832892 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.832872 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.835160 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.835140 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 23:26:12.835344 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.835326 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 23:26:12.835344 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.835336 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xwc2q\"" Apr 16 23:26:12.835481 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.835335 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 23:26:12.841498 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.841480 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 23:26:12.849218 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.849200 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2gmq5"] Apr 16 23:26:12.849357 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.849342 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:12.851448 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.851426 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 23:26:12.851747 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.851717 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\"" Apr 16 23:26:12.851832 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.851754 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 23:26:12.869811 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.869788 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9jbxk"] Apr 16 23:26:12.869811 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.869814 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-684587478b-ftwbv"] Apr 16 23:26:12.870116 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.869825 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gmq5"] Apr 16 23:26:12.870116 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.869945 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:12.872340 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.872323 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 23:26:12.872437 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.872327 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\"" Apr 16 23:26:12.872824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.872697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 23:26:12.872824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.872732 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 23:26:12.949685 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:12.949685 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a35ffddb-daf5-4419-8950-a0f8be5ddeba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:12.949903 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949711 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.949903 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.949903 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57zs2\" (UniqueName: \"kubernetes.io/projected/521d672a-517e-4c54-a3c8-a1af436fb79c-kube-api-access-57zs2\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:12.949903 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.950089 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5924z\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.950089 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.949975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.950089 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:12.950089 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vs5n\" (UniqueName: \"kubernetes.io/projected/3711a978-4a77-4055-9046-6ebb4a5ffeb1-kube-api-access-4vs5n\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:12.950268 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/521d672a-517e-4c54-a3c8-a1af436fb79c-tmp-dir\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:12.950268 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.950268 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950157 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.950268 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:12.950268 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950257 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/521d672a-517e-4c54-a3c8-a1af436fb79c-config-volume\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:12.950476 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.950323 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:12.976906 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.976873 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:12.977208 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.976875 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:12.977208 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.976878 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:12.979848 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.979825 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 23:26:12.979981 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.979878 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-grjf2\"" Apr 16 23:26:12.979981 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.979883 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 23:26:12.980097 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.979985 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 23:26:12.980145 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.980104 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 23:26:12.980210 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:12.980179 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\"" Apr 16 23:26:13.051267 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/521d672a-517e-4c54-a3c8-a1af436fb79c-config-volume\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.051421 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051332 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051421 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:13.051421 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a35ffddb-daf5-4419-8950-a0f8be5ddeba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:13.051576 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051576 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051460 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051576 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57zs2\" (UniqueName: \"kubernetes.io/projected/521d672a-517e-4c54-a3c8-a1af436fb79c-kube-api-access-57zs2\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.051576 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051576 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5924z\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051630 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051655 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vs5n\" (UniqueName: \"kubernetes.io/projected/3711a978-4a77-4055-9046-6ebb4a5ffeb1-kube-api-access-4vs5n\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/521d672a-517e-4c54-a3c8-a1af436fb79c-tmp-dir\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051715 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/521d672a-517e-4c54-a3c8-a1af436fb79c-config-volume\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051739 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.051766 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.051765 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.051853 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.051896 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.051921 2569 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:13.551903457 +0000 UTC m=+34.153697644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.051977 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:13.551938342 +0000 UTC m=+34.153732540 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.052088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/521d672a-517e-4c54-a3c8-a1af436fb79c-tmp-dir\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.052136 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.052150 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.052189 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:13.552175519 +0000 UTC m=+34.153969709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:13.052238 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.052247 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
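
The run of nestedpendingoperations.go:348 records above shows the kubelet's per-volume mount retry policy: each failed MountVolume.SetUp roughly doubles durationBeforeRetry, and later records in this section show the same volumes at 1s, 2s, 4s and 8s. A minimal Go sketch of that progression, assuming an illustrative 2-minute cap (the cap itself is not visible in this log):

```go
package main

import (
	"fmt"
	"time"
)

// Minimal sketch of the doubling retry delay visible in the
// nestedpendingoperations.go:348 records: every failed MountVolume.SetUp for
// a given volume doubles durationBeforeRetry. Only the 500ms start and the
// doubling are taken from this log; the 2-minute cap is an assumption.
func nextBackoff(last time.Duration) time.Duration {
	const (
		initialDelay = 500 * time.Millisecond
		maxDelay     = 2 * time.Minute // assumed cap, not shown in the log
	)
	if last == 0 {
		return initialDelay
	}
	if next := 2 * last; next < maxDelay {
		return next
	}
	return maxDelay
}

func main() {
	var d time.Duration
	for i := 0; i < 7; i++ {
		d = nextBackoff(d)
		fmt.Println(d) // 500ms, 1s, 2s, 4s, 8s, 16s, 32s
	}
}
```

Because the backoff is tracked per volume, a volume that started failing earlier in the boot carries a longer delay, which is why metrics-certs further down is already at 32s.
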
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:13.052689 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.052537 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.052805 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.052739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.052805 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.052784 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a35ffddb-daf5-4419-8950-a0f8be5ddeba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:13.053554 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.053532 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.056513 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.056493 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.056610 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.056527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.063073 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.063044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57zs2\" (UniqueName: \"kubernetes.io/projected/521d672a-517e-4c54-a3c8-a1af436fb79c-kube-api-access-57zs2\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.063579 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.063556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.063680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.063654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5924z\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.070650 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.070631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vs5n\" (UniqueName: \"kubernetes.io/projected/3711a978-4a77-4055-9046-6ebb4a5ffeb1-kube-api-access-4vs5n\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:13.555741 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.555694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.555792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.555821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555844 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.555860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555923 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:26:14.555902411 +0000 UTC m=+35.157696599 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555923 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555937 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555970 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.556002 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:13.556026 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.555973 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:14.555949694 +0000 UTC m=+35.157743877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:13.556498 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.556051 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:14.556033412 +0000 UTC m=+35.157827599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:13.556498 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.556067 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:14.556058619 +0000 UTC m=+35.157852803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:13.657308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.657274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:13.660300 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.660269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng7xg\" (UniqueName: \"kubernetes.io/projected/ca2349c9-d7e3-465e-a9da-79633f8e3aaa-kube-api-access-ng7xg\") pod \"network-check-target-hc8fq\" (UID: \"ca2349c9-d7e3-465e-a9da-79633f8e3aaa\") " pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:13.758517 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.758475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:13.759024 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.758627 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:26:13.759024 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:13.758698 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:45.758681708 +0000 UTC m=+66.360475893 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : secret "metrics-daemon-secret" not found Apr 16 23:26:13.896841 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:13.896807 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:14.085674 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.085506 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hc8fq"] Apr 16 23:26:14.088932 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:26:14.088902 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca2349c9_d7e3_465e_a9da_79633f8e3aaa.slice/crio-f85ab14eaf7d27cd1bc7be95234523aec66e9cbc87e61188212abb0cca2ff55a WatchSource:0}: Error finding container f85ab14eaf7d27cd1bc7be95234523aec66e9cbc87e61188212abb0cca2ff55a: Status 404 returned error can't find the container with id f85ab14eaf7d27cd1bc7be95234523aec66e9cbc87e61188212abb0cca2ff55a Apr 16 23:26:14.112081 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.112053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hc8fq" event={"ID":"ca2349c9-d7e3-465e-a9da-79633f8e3aaa","Type":"ContainerStarted","Data":"f85ab14eaf7d27cd1bc7be95234523aec66e9cbc87e61188212abb0cca2ff55a"} Apr 16 23:26:14.565284 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.565250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.565315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.565345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:14.565378 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565395 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565409 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565427 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret 
"image-registry-tls" not found Apr 16 23:26:14.565453 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565453 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:26:16.565438949 +0000 UTC m=+37.167233134 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:14.565762 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565477 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:16.565460814 +0000 UTC m=+37.167255012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:14.565762 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565486 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:14.565762 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565490 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:14.565762 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565535 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:16.565523815 +0000 UTC m=+37.167318010 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:14.565762 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:14.565548 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:16.565542417 +0000 UTC m=+37.167336604 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:15.977174 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:15.977138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:15.981098 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:15.981072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/11bac137-baa0-441f-9af0-85cedda59681-original-pull-secret\") pod \"global-pull-secret-syncer-lr65x\" (UID: \"11bac137-baa0-441f-9af0-85cedda59681\") " pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:15.988906 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:15.988880 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lr65x" Apr 16 23:26:16.583686 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:16.583609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:16.583686 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:16.583649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:16.583684 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:16.583744 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583771 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583789 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583821 2569 secret.go:189] Couldn't get 
secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583836 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583787 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:16.583899 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583847 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:20.583829399 +0000 UTC m=+41.185623604 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:16.584262 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583921 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:20.583898308 +0000 UTC m=+41.185692504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:16.584262 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583943 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:26:20.583931145 +0000 UTC m=+41.185725332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:16.584262 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:16.583987 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:20.583976065 +0000 UTC m=+41.185770250 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:18.281880 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.281843 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k"] Apr 16 23:26:18.301492 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.301467 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k"] Apr 16 23:26:18.301492 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.301492 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8"] Apr 16 23:26:18.301671 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.301629 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.304697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.304395 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 23:26:18.304697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.304407 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 23:26:18.304697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.304474 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-nfxlx\"" Apr 16 23:26:18.304697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.304491 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 23:26:18.304697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.304573 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 23:26:18.316209 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.316186 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf"] Apr 16 23:26:18.316448 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.316428 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.318767 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.318742 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 23:26:18.333830 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.333811 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8"] Apr 16 23:26:18.333935 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.333834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf"] Apr 16 23:26:18.334022 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.333937 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.336225 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.336205 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 23:26:18.336309 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.336235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 23:26:18.336309 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.336266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 23:26:18.336309 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.336266 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 23:26:18.397784 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.397758 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6wq\" (UniqueName: \"kubernetes.io/projected/8132a25b-d82e-4af0-a75f-ff47604159a2-kube-api-access-pw6wq\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.397914 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.397807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-tmp\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.398003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.397926 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q84m2\" (UniqueName: \"kubernetes.io/projected/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-kube-api-access-q84m2\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.398058 ip-10-0-136-147 kubenswrapper[2569]: I0416 
23:26:18.398014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8132a25b-d82e-4af0-a75f-ff47604159a2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.398058 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.398041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-klusterlet-config\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.499321 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499282 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q84m2\" (UniqueName: \"kubernetes.io/projected/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-kube-api-access-q84m2\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.499494 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499494 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499402 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499494 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499425 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55x7\" (UniqueName: \"kubernetes.io/projected/f7c4c348-989e-47d3-bc23-b520ff8bdca3-kube-api-access-p55x7\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499494 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499448 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499659 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499488 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499659 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-tmp\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.499659 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6wq\" (UniqueName: \"kubernetes.io/projected/8132a25b-d82e-4af0-a75f-ff47604159a2-kube-api-access-pw6wq\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.499780 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499698 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.499780 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499746 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8132a25b-d82e-4af0-a75f-ff47604159a2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.499874 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499776 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-klusterlet-config\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.499989 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.499949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-tmp\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.502356 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.502328 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-klusterlet-config\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.502356 ip-10-0-136-147 kubenswrapper[2569]: I0416 
23:26:18.502346 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8132a25b-d82e-4af0-a75f-ff47604159a2-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.506693 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.506673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6wq\" (UniqueName: \"kubernetes.io/projected/8132a25b-d82e-4af0-a75f-ff47604159a2-kube-api-access-pw6wq\") pod \"managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k\" (UID: \"8132a25b-d82e-4af0-a75f-ff47604159a2\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.507014 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.506992 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q84m2\" (UniqueName: \"kubernetes.io/projected/6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752-kube-api-access-q84m2\") pod \"klusterlet-addon-workmgr-758f866c45-65zl8\" (UID: \"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.601137 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.601266 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601224 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.601266 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601249 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.601333 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601272 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p55x7\" (UniqueName: \"kubernetes.io/projected/f7c4c348-989e-47d3-bc23-b520ff8bdca3-kube-api-access-p55x7\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.601491 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: 
\"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.601549 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.601504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.602114 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.602081 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.603596 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.603566 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.603686 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.603634 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.603686 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.603663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-ca\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.613551 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.613533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f7c4c348-989e-47d3-bc23-b520ff8bdca3-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.615881 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.615864 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55x7\" (UniqueName: \"kubernetes.io/projected/f7c4c348-989e-47d3-bc23-b520ff8bdca3-kube-api-access-p55x7\") pod \"cluster-proxy-proxy-agent-55cb6db555-xqbkf\" (UID: \"f7c4c348-989e-47d3-bc23-b520ff8bdca3\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.621720 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.621701 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" Apr 16 23:26:18.628359 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.628343 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:18.641893 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.641870 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:26:18.956700 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.956417 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lr65x"] Apr 16 23:26:18.961498 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:26:18.960721 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11bac137_baa0_441f_9af0_85cedda59681.slice/crio-df8fd146193f9941892c96c8fc07d8efbd884b9fe642b7894810db9f59bc8858 WatchSource:0}: Error finding container df8fd146193f9941892c96c8fc07d8efbd884b9fe642b7894810db9f59bc8858: Status 404 returned error can't find the container with id df8fd146193f9941892c96c8fc07d8efbd884b9fe642b7894810db9f59bc8858 Apr 16 23:26:18.966403 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.966358 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8"] Apr 16 23:26:18.969164 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:26:18.969141 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bcce6f3_56c5_4cf3_b0ed_7f90b4e39752.slice/crio-dc99e027b625e62562c87aafc7107f08cc7c54b4a506113b830672530d76b9ba WatchSource:0}: Error finding container dc99e027b625e62562c87aafc7107f08cc7c54b4a506113b830672530d76b9ba: Status 404 returned error can't find the container with id dc99e027b625e62562c87aafc7107f08cc7c54b4a506113b830672530d76b9ba Apr 16 23:26:18.988180 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.988154 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf"] Apr 16 23:26:18.990021 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:18.990003 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k"] Apr 16 23:26:18.992078 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:26:18.992051 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7c4c348_989e_47d3_bc23_b520ff8bdca3.slice/crio-b34032902bc036fcea9ca2b651ced1957c18ed4319989651a4950c1c4acee678 WatchSource:0}: Error finding container b34032902bc036fcea9ca2b651ced1957c18ed4319989651a4950c1c4acee678: Status 404 returned error can't find the container with id b34032902bc036fcea9ca2b651ced1957c18ed4319989651a4950c1c4acee678 Apr 16 23:26:19.003676 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:26:19.003653 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8132a25b_d82e_4af0_a75f_ff47604159a2.slice/crio-5ad1fe3c512ac59886a0a0b8b8d2550fd62fcc363bd35334eeff484bbf3c19b3 WatchSource:0}: Error finding container 
5ad1fe3c512ac59886a0a0b8b8d2550fd62fcc363bd35334eeff484bbf3c19b3: Status 404 returned error can't find the container with id 5ad1fe3c512ac59886a0a0b8b8d2550fd62fcc363bd35334eeff484bbf3c19b3 Apr 16 23:26:19.121562 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.121476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lr65x" event={"ID":"11bac137-baa0-441f-9af0-85cedda59681","Type":"ContainerStarted","Data":"df8fd146193f9941892c96c8fc07d8efbd884b9fe642b7894810db9f59bc8858"} Apr 16 23:26:19.122580 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.122536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" event={"ID":"8132a25b-d82e-4af0-a75f-ff47604159a2","Type":"ContainerStarted","Data":"5ad1fe3c512ac59886a0a0b8b8d2550fd62fcc363bd35334eeff484bbf3c19b3"} Apr 16 23:26:19.123549 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.123518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" event={"ID":"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752","Type":"ContainerStarted","Data":"dc99e027b625e62562c87aafc7107f08cc7c54b4a506113b830672530d76b9ba"} Apr 16 23:26:19.125999 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.125947 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerStarted","Data":"24f1ef714b921ec3f3c166f09772fa18722a13c4171e300d9ea9162dadfda65d"} Apr 16 23:26:19.127306 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.127288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hc8fq" event={"ID":"ca2349c9-d7e3-465e-a9da-79633f8e3aaa","Type":"ContainerStarted","Data":"61897e43c0cb31c1659478015c604622ad541ec62974cc5d09ea1b2ef21bc0ed"} Apr 16 23:26:19.127446 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.127429 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:26:19.128351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.128329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerStarted","Data":"b34032902bc036fcea9ca2b651ced1957c18ed4319989651a4950c1c4acee678"} Apr 16 23:26:19.158599 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:19.158548 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hc8fq" podStartSLOduration=34.470724566 podStartE2EDuration="39.158535214s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:26:14.09070391 +0000 UTC m=+34.692498094" lastFinishedPulling="2026-04-16 23:26:18.778514533 +0000 UTC m=+39.380308742" observedRunningTime="2026-04-16 23:26:19.158249324 +0000 UTC m=+39.760043529" watchObservedRunningTime="2026-04-16 23:26:19.158535214 +0000 UTC m=+39.760329435" Apr 16 23:26:20.137641 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:20.137373 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="24f1ef714b921ec3f3c166f09772fa18722a13c4171e300d9ea9162dadfda65d" exitCode=0 Apr 16 23:26:20.137641 ip-10-0-136-147 
kubenswrapper[2569]: I0416 23:26:20.137612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"24f1ef714b921ec3f3c166f09772fa18722a13c4171e300d9ea9162dadfda65d"} Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:20.630538 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:20.630600 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:20.630631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:20.630661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.630846 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.630908 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:28.630888589 +0000 UTC m=+49.232682773 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.630995 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631034 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:26:28.63102214 +0000 UTC m=+49.232816330 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631142 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631172 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:28.631161523 +0000 UTC m=+49.232955713 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631235 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631246 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:20.631884 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:20.631277 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:28.63126688 +0000 UTC m=+49.233061069 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:21.143544 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:21.143506 2569 generic.go:358] "Generic (PLEG): container finished" podID="be198f06-cdcd-4d45-84cb-08bf655ad486" containerID="4eeee502d5b7e9b02ff15071142b856b058d7a0bb191055e727f3a9e13ed8344" exitCode=0 Apr 16 23:26:21.144169 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:21.143592 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerDied","Data":"4eeee502d5b7e9b02ff15071142b856b058d7a0bb191055e727f3a9e13ed8344"} Apr 16 23:26:23.151355 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:23.151126 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" event={"ID":"be198f06-cdcd-4d45-84cb-08bf655ad486","Type":"ContainerStarted","Data":"ac7cea97cb0a4dd9dc3d110ce4112c0cda9a705a59c90e60b1a16d32413d4fc3"} Apr 16 23:26:23.152593 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:23.152561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerStarted","Data":"acca0e3890faf298963849144b7e23880abd2c0d00b554c9a0e835ce85a36995"} Apr 16 23:26:23.153869 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:23.153845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" event={"ID":"8132a25b-d82e-4af0-a75f-ff47604159a2","Type":"ContainerStarted","Data":"31b5765164c132ef6ed8996838a6b17eeadd248c099fbc9bb0448fac3db4e830"} Apr 16 23:26:23.173087 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:23.173037 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8t8tf" podStartSLOduration=6.987274954 podStartE2EDuration="43.173021461s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:25:42.592769536 +0000 UTC m=+3.194563723" lastFinishedPulling="2026-04-16 23:26:18.778516042 +0000 UTC m=+39.380310230" observedRunningTime="2026-04-16 23:26:23.172258056 +0000 UTC m=+43.774052265" watchObservedRunningTime="2026-04-16 23:26:23.173021461 +0000 UTC m=+43.774815667" Apr 16 23:26:23.186110 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:23.186062 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" podStartSLOduration=1.548929934 podStartE2EDuration="5.186047646s" podCreationTimestamp="2026-04-16 23:26:18 +0000 UTC" firstStartedPulling="2026-04-16 23:26:19.005418998 +0000 UTC m=+39.607213194" lastFinishedPulling="2026-04-16 23:26:22.642536708 +0000 UTC m=+43.244330906" observedRunningTime="2026-04-16 23:26:23.185320908 +0000 UTC m=+43.787115117" watchObservedRunningTime="2026-04-16 23:26:23.186047646 +0000 UTC m=+43.787841854" Apr 16 23:26:28.170043 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.167195 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerStarted","Data":"8d9f408afd9ea9dd1c72921061d07c76f631e81c5a9b759c9a17e9bc57943dd0"} Apr 16 23:26:28.170043 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.169186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" event={"ID":"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752","Type":"ContainerStarted","Data":"884eef01b16c4e9d09e187f093ee092f82ee6535095f426a0be6f3ff5d9012a0"} Apr 16 23:26:28.170864 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.170815 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:28.171433 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.171405 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" podUID="6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused" Apr 16 23:26:28.188391 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.187571 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" podStartSLOduration=1.079614604 podStartE2EDuration="10.187554111s" podCreationTimestamp="2026-04-16 23:26:18 +0000 UTC" firstStartedPulling="2026-04-16 23:26:18.971431422 +0000 UTC m=+39.573225617" lastFinishedPulling="2026-04-16 23:26:28.079370931 +0000 UTC m=+48.681165124" observedRunningTime="2026-04-16 23:26:28.186602848 +0000 UTC m=+48.788397054" watchObservedRunningTime="2026-04-16 23:26:28.187554111 +0000 UTC m=+48.789348318" Apr 16 23:26:28.694085 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.694049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:28.694085 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.694090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.694111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:28.694131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: 
\"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694191 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694201 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694220 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694256 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694261 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:26:44.694245343 +0000 UTC m=+65.296039527 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694277 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:44.694270941 +0000 UTC m=+65.296065124 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694207 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694296 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:44.694281892 +0000 UTC m=+65.296076080 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:28.694335 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:28.694311 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:44.694304854 +0000 UTC m=+65.296099038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:29.173286 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:29.173242 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lr65x" event={"ID":"11bac137-baa0-441f-9af0-85cedda59681","Type":"ContainerStarted","Data":"706cf7438182f0be9df2d20c874b8eaeec8ded2ce6262bcc812ab2cb0fe3e1f2"} Apr 16 23:26:29.175035 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:29.175010 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerStarted","Data":"c42bbab777279b5e6270fc47d669aa4ecfe27bfa2bf5d9a69a1d5f804d658fe0"} Apr 16 23:26:29.175756 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:29.175739 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:26:29.200581 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:29.200537 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lr65x" podStartSLOduration=20.06634084 podStartE2EDuration="29.200524722s" podCreationTimestamp="2026-04-16 23:26:00 +0000 UTC" firstStartedPulling="2026-04-16 23:26:18.962918211 +0000 UTC m=+39.564712397" lastFinishedPulling="2026-04-16 23:26:28.097102082 +0000 UTC m=+48.698896279" observedRunningTime="2026-04-16 23:26:29.186698826 +0000 UTC m=+49.788493033" watchObservedRunningTime="2026-04-16 23:26:29.200524722 +0000 UTC m=+49.802318907" Apr 16 23:26:29.217449 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:29.217408 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" podStartSLOduration=2.127077398 podStartE2EDuration="11.217396672s" podCreationTimestamp="2026-04-16 23:26:18 +0000 UTC" firstStartedPulling="2026-04-16 23:26:18.994092386 +0000 UTC m=+39.595886570" lastFinishedPulling="2026-04-16 23:26:28.084411646 +0000 UTC m=+48.686205844" observedRunningTime="2026-04-16 23:26:29.21662325 +0000 UTC m=+49.818417456" watchObservedRunningTime="2026-04-16 23:26:29.217396672 +0000 UTC m=+49.819190878" Apr 16 23:26:39.222140 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:39.222108 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jvvbp" Apr 16 23:26:44.716092 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:44.716049 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:44.716153 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:44.716196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716209 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:44.716227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716276 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716311 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716326 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716339 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716282 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:27:16.716260938 +0000 UTC m=+97.318055133 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716374 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:27:16.716361661 +0000 UTC m=+97.318155845 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716387 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:16.716379878 +0000 UTC m=+97.318174062 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:26:44.716538 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:44.716403 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:16.716394819 +0000 UTC m=+97.318189004 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:26:45.824647 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:45.824609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:26:45.825071 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:45.824716 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:26:45.825071 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:26:45.824767 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:49.824753837 +0000 UTC m=+130.426548021 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : secret "metrics-daemon-secret" not found Apr 16 23:26:50.140284 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:26:50.140254 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hc8fq" Apr 16 23:27:16.753378 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:27:16.753342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:27:16.753385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:27:16.753403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:27:16.753427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753507 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753512 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753519 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753575 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:20.7535597 +0000 UTC m=+161.355353888 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753581 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753510 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753595 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:28:20.753585489 +0000 UTC m=+161.355379673 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753624 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:20.753605862 +0000 UTC m=+161.355400049 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:27:16.753824 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:16.753645 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:20.753631059 +0000 UTC m=+161.355425243 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:27:49.901617 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:27:49.901565 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:27:49.902211 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:49.901688 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 23:27:49.902211 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:27:49.901752 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs podName:28beae99-07bd-4677-b7d1-d83bd564ca27 nodeName:}" failed. No retries permitted until 2026-04-16 23:29:51.901736812 +0000 UTC m=+252.503530996 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs") pod "network-metrics-daemon-2mrw9" (UID: "28beae99-07bd-4677-b7d1-d83bd564ca27") : secret "metrics-daemon-secret" not found Apr 16 23:28:04.317641 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:04.317615 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bhhr_1b15b520-d13a-41b6-b06f-81365371c0a0/dns-node-resolver/0.log" Apr 16 23:28:04.917706 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:04.917673 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cw7zr_621008c7-04c4-43b9-87b0-8ba9013bdecc/node-ca/0.log" Apr 16 23:28:15.741057 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.741024 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cw66v"] Apr 16 23:28:15.744000 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.743981 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.746588 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.746560 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:28:15.747678 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.747660 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:28:15.747786 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.747660 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jrpnr\"" Apr 16 23:28:15.747786 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.747720 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 23:28:15.747786 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.747735 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 23:28:15.753101 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.753081 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cw66v"] Apr 16 23:28:15.783209 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.783185 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6a0fd6cc-6997-498d-84cd-310d912c1372-crio-socket\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.783304 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.783217 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.783304 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.783247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6a0fd6cc-6997-498d-84cd-310d912c1372-data-volume\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.783304 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.783271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlw9h\" (UniqueName: \"kubernetes.io/projected/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-api-access-tlw9h\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.783418 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.783331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: 
\"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.823312 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.823283 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" podUID="a35ffddb-daf5-4419-8950-a0f8be5ddeba" Apr 16 23:28:15.842249 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.842223 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-684587478b-ftwbv" podUID="49d6804e-8384-4066-8d9e-897d85fc6a42" Apr 16 23:28:15.859367 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.859340 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9jbxk" podUID="521d672a-517e-4c54-a3c8-a1af436fb79c" Apr 16 23:28:15.879745 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.879724 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2gmq5" podUID="3711a978-4a77-4055-9046-6ebb4a5ffeb1" Apr 16 23:28:15.883975 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.883940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884057 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.883996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6a0fd6cc-6997-498d-84cd-310d912c1372-data-volume\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884057 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlw9h\" (UniqueName: \"kubernetes.io/projected/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-api-access-tlw9h\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884057 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884197 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884097 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/6a0fd6cc-6997-498d-84cd-310d912c1372-crio-socket\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884197 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884187 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6a0fd6cc-6997-498d-84cd-310d912c1372-crio-socket\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884288 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.884192 2569 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.884288 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:15.884265 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls podName:6a0fd6cc-6997-498d-84cd-310d912c1372 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:16.384248604 +0000 UTC m=+156.986042788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls") pod "insights-runtime-extractor-cw66v" (UID: "6a0fd6cc-6997-498d-84cd-310d912c1372") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.884393 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884374 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6a0fd6cc-6997-498d-84cd-310d912c1372-data-volume\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.884552 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.884533 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:15.895091 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:15.895068 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlw9h\" (UniqueName: \"kubernetes.io/projected/6a0fd6cc-6997-498d-84cd-310d912c1372-kube-api-access-tlw9h\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:16.002192 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:16.002109 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2mrw9" podUID="28beae99-07bd-4677-b7d1-d83bd564ca27" Apr 16 23:28:16.389218 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:16.389185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:16.389362 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:16.389333 2569 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:16.389415 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:16.389405 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls podName:6a0fd6cc-6997-498d-84cd-310d912c1372 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:17.389389168 +0000 UTC m=+157.991183352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls") pod "insights-runtime-extractor-cw66v" (UID: "6a0fd6cc-6997-498d-84cd-310d912c1372") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:16.420566 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:16.420544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:28:16.420663 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:16.420544 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9jbxk" Apr 16 23:28:16.420729 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:16.420545 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:28:17.399332 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:17.399283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:17.399709 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:17.399440 2569 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:17.399709 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:17.399505 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls podName:6a0fd6cc-6997-498d-84cd-310d912c1372 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:19.399489085 +0000 UTC m=+160.001283268 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls") pod "insights-runtime-extractor-cw66v" (UID: "6a0fd6cc-6997-498d-84cd-310d912c1372") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.416269 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:19.416229 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:19.416658 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:19.416375 2569 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.416658 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:19.416439 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls podName:6a0fd6cc-6997-498d-84cd-310d912c1372 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:23.416423285 +0000 UTC m=+164.018217469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls") pod "insights-runtime-extractor-cw66v" (UID: "6a0fd6cc-6997-498d-84cd-310d912c1372") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:20.828812 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:20.828780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:20.828827 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.828923 2569 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.828986 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.828999 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert podName:a35ffddb-daf5-4419-8950-a0f8be5ddeba nodeName:}" failed. No retries permitted until 2026-04-16 23:30:22.828981482 +0000 UTC m=+283.430775671 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c6rh" (UID: "a35ffddb-daf5-4419-8950-a0f8be5ddeba") : secret "networking-console-plugin-cert" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829024 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert podName:3711a978-4a77-4055-9046-6ebb4a5ffeb1 nodeName:}" failed. No retries permitted until 2026-04-16 23:30:22.829012222 +0000 UTC m=+283.430806409 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert") pod "ingress-canary-2gmq5" (UID: "3711a978-4a77-4055-9046-6ebb4a5ffeb1") : secret "canary-serving-cert" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:20.829040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") pod \"image-registry-684587478b-ftwbv\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:20.829070 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829170 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829176 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829191 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-684587478b-ftwbv: secret "image-registry-tls" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829210 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls podName:521d672a-517e-4c54-a3c8-a1af436fb79c nodeName:}" failed. No retries permitted until 2026-04-16 23:30:22.829199098 +0000 UTC m=+283.430993294 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls") pod "dns-default-9jbxk" (UID: "521d672a-517e-4c54-a3c8-a1af436fb79c") : secret "dns-default-metrics-tls" not found Apr 16 23:28:20.829256 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:20.829227 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls podName:49d6804e-8384-4066-8d9e-897d85fc6a42 nodeName:}" failed. No retries permitted until 2026-04-16 23:30:22.829215472 +0000 UTC m=+283.431009676 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls") pod "image-registry-684587478b-ftwbv" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42") : secret "image-registry-tls" not found Apr 16 23:28:23.438894 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.438862 2569 generic.go:358] "Generic (PLEG): container finished" podID="8132a25b-d82e-4af0-a75f-ff47604159a2" containerID="31b5765164c132ef6ed8996838a6b17eeadd248c099fbc9bb0448fac3db4e830" exitCode=255 Apr 16 23:28:23.439263 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.438940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" event={"ID":"8132a25b-d82e-4af0-a75f-ff47604159a2","Type":"ContainerDied","Data":"31b5765164c132ef6ed8996838a6b17eeadd248c099fbc9bb0448fac3db4e830"} Apr 16 23:28:23.439263 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.439243 2569 scope.go:117] "RemoveContainer" containerID="31b5765164c132ef6ed8996838a6b17eeadd248c099fbc9bb0448fac3db4e830" Apr 16 23:28:23.449013 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.448985 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:23.451330 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.451306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6a0fd6cc-6997-498d-84cd-310d912c1372-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cw66v\" (UID: \"6a0fd6cc-6997-498d-84cd-310d912c1372\") " pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:23.553012 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.552984 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cw66v" Apr 16 23:28:23.664075 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:23.664048 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cw66v"] Apr 16 23:28:23.666482 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:28:23.666455 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0fd6cc_6997_498d_84cd_310d912c1372.slice/crio-d5fbaaeb8b35b859d4659e747e65124a10fef1761d87012ac972f514c1595553 WatchSource:0}: Error finding container d5fbaaeb8b35b859d4659e747e65124a10fef1761d87012ac972f514c1595553: Status 404 returned error can't find the container with id d5fbaaeb8b35b859d4659e747e65124a10fef1761d87012ac972f514c1595553 Apr 16 23:28:24.443120 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:24.443086 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw66v" event={"ID":"6a0fd6cc-6997-498d-84cd-310d912c1372","Type":"ContainerStarted","Data":"6ed347882906b4fb4c6cfd33be8148f5e8eeacbec47ddeaed30f03a04bc2820d"} Apr 16 23:28:24.443490 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:24.443127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw66v" event={"ID":"6a0fd6cc-6997-498d-84cd-310d912c1372","Type":"ContainerStarted","Data":"09166d82a8ba0d93b20af2e1e8baa7297e3a35e95059cc8c831fa0026e80bade"} Apr 16 23:28:24.443490 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:24.443143 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw66v" event={"ID":"6a0fd6cc-6997-498d-84cd-310d912c1372","Type":"ContainerStarted","Data":"d5fbaaeb8b35b859d4659e747e65124a10fef1761d87012ac972f514c1595553"} Apr 16 23:28:24.444545 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:24.444521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7f9fc4ffcf-jrz4k" event={"ID":"8132a25b-d82e-4af0-a75f-ff47604159a2","Type":"ContainerStarted","Data":"a6be802a19825e773e8fab01608d15d775a58d15bec6d97dc70510d9d40876f1"} Apr 16 23:28:26.451115 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:26.451084 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cw66v" event={"ID":"6a0fd6cc-6997-498d-84cd-310d912c1372","Type":"ContainerStarted","Data":"01485466cb3d48f27b7e7a8ed4bde400bab3a1efba3f0634ff25071aa0ef1f81"} Apr 16 23:28:26.468031 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:26.467990 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cw66v" podStartSLOduration=9.394830937 podStartE2EDuration="11.467975614s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="2026-04-16 23:28:23.719904103 +0000 UTC m=+164.321698287" lastFinishedPulling="2026-04-16 23:28:25.793048768 +0000 UTC m=+166.394842964" observedRunningTime="2026-04-16 23:28:26.466696826 +0000 UTC m=+167.068491032" watchObservedRunningTime="2026-04-16 23:28:26.467975614 +0000 UTC m=+167.069769815" Apr 16 23:28:28.456480 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:28.456448 2569 generic.go:358] "Generic (PLEG): container finished" podID="6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752" containerID="884eef01b16c4e9d09e187f093ee092f82ee6535095f426a0be6f3ff5d9012a0" exitCode=1 Apr 16 
23:28:28.456825 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:28.456507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" event={"ID":"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752","Type":"ContainerDied","Data":"884eef01b16c4e9d09e187f093ee092f82ee6535095f426a0be6f3ff5d9012a0"} Apr 16 23:28:28.456892 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:28.456825 2569 scope.go:117] "RemoveContainer" containerID="884eef01b16c4e9d09e187f093ee092f82ee6535095f426a0be6f3ff5d9012a0" Apr 16 23:28:28.628882 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:28.628847 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:28:28.976358 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:28.976283 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:28:29.175939 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:29.175904 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:28:29.460661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:29.460632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" event={"ID":"6bcce6f3-56c5-4cf3-b0ed-7f90b4e39752","Type":"ContainerStarted","Data":"e27bc22baeafe9dfb37dae8a0db8de64c2c008b5b4f9d72734c78501789a9918"} Apr 16 23:28:29.461086 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:29.460866 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:28:29.461503 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:29.461488 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758f866c45-65zl8" Apr 16 23:28:30.976763 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:30.976717 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:28:41.690526 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.690495 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2h6sb"] Apr 16 23:28:41.696325 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.696303 2569 util.go:30] "No sandbox for pod can be found. 
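 
The "Generic (PLEG): container finished" / "SyncLoop (PLEG)" pairs are the Pod Lifecycle Event Generator relisting container state and feeding ContainerDied/ContainerStarted events back into the sync loop; for a pod whose restart policy allows it, a finished container (exitCode=1 here) is removed and recreated, which is exactly the RemoveContainer followed by ContainerStarted above. The same history is visible from the API side in the pod's container statuses; a hedged client-go sketch (in-cluster config and the pod name taken from this log are assumptions):

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/rest"
    )

    func main() {
    	cfg, err := rest.InClusterConfig() // assumes this runs inside the cluster
    	if err != nil {
    		panic(err)
    	}
    	cs := kubernetes.NewForConfigOrDie(cfg)

    	// Pod from the log above; adjust namespace/name for your own case.
    	pod, err := cs.CoreV1().Pods("open-cluster-management-agent-addon").
    		Get(context.TODO(), "klusterlet-addon-workmgr-758f866c45-65zl8", metav1.GetOptions{})
    	if err != nil {
    		panic(err)
    	}
    	for _, st := range pod.Status.ContainerStatuses {
    		if t := st.LastTerminationState.Terminated; t != nil {
    			// exitCode here corresponds to the ContainerDied event in the log.
    			fmt.Printf("%s: restarts=%d lastExitCode=%d reason=%s\n",
    				st.Name, st.RestartCount, t.ExitCode, t.Reason)
    		}
    	}
    }
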
Need to start a new one" pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.698786 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.698763 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 23:28:41.698786 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.698777 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 23:28:41.700185 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.700164 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 23:28:41.700185 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.700172 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 23:28:41.700318 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.700210 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 23:28:41.700318 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.700288 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 23:28:41.700578 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.700555 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sfntm\"" Apr 16 23:28:41.791918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.791881 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjwl\" (UniqueName: \"kubernetes.io/projected/f87142d4-7511-4ad7-8599-27eb607fb332-kube-api-access-lkjwl\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.791918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.791918 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-sys\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.791993 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-textfile\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-accelerators-collector-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792058 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-root\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792116 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792254 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792182 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-wtmp\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792254 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792209 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.792254 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.792225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-metrics-client-ca\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892562 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892528 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-wtmp\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892590 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-metrics-client-ca\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892612 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjwl\" (UniqueName: \"kubernetes.io/projected/f87142d4-7511-4ad7-8599-27eb607fb332-kube-api-access-lkjwl\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " 
pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-sys\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-textfile\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892672 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-accelerators-collector-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:41.892692 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-sys\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:41.892776 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls podName:f87142d4-7511-4ad7-8599-27eb607fb332 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:42.392756002 +0000 UTC m=+182.994550206 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls") pod "node-exporter-2h6sb" (UID: "f87142d4-7511-4ad7-8599-27eb607fb332") : secret "node-exporter-tls" not found Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892703 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-root\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-root\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.892918 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.892695 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-wtmp\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.893224 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.893044 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-textfile\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.893340 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.893312 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-metrics-client-ca\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.893451 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.893401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-accelerators-collector-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.895225 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.895209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " 
pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:41.903613 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:41.903592 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjwl\" (UniqueName: \"kubernetes.io/projected/f87142d4-7511-4ad7-8599-27eb607fb332-kube-api-access-lkjwl\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:42.396426 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:42.396384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:42.398667 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:42.398633 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f87142d4-7511-4ad7-8599-27eb607fb332-node-exporter-tls\") pod \"node-exporter-2h6sb\" (UID: \"f87142d4-7511-4ad7-8599-27eb607fb332\") " pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:42.605749 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:42.605711 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2h6sb" Apr 16 23:28:42.613791 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:28:42.613761 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf87142d4_7511_4ad7_8599_27eb607fb332.slice/crio-c073ac7cc68ab25fa585b837079aacbe99252806212e39e3e93d5260081d30fb WatchSource:0}: Error finding container c073ac7cc68ab25fa585b837079aacbe99252806212e39e3e93d5260081d30fb: Status 404 returned error can't find the container with id c073ac7cc68ab25fa585b837079aacbe99252806212e39e3e93d5260081d30fb Apr 16 23:28:43.497810 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:43.497736 2569 generic.go:358] "Generic (PLEG): container finished" podID="f87142d4-7511-4ad7-8599-27eb607fb332" containerID="9bae903edb8ec8791d6583bda8d4b8585e6f40475161de2e65c3220702e5a50f" exitCode=0 Apr 16 23:28:43.497810 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:43.497778 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2h6sb" event={"ID":"f87142d4-7511-4ad7-8599-27eb607fb332","Type":"ContainerDied","Data":"9bae903edb8ec8791d6583bda8d4b8585e6f40475161de2e65c3220702e5a50f"} Apr 16 23:28:43.497810 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:43.497804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2h6sb" event={"ID":"f87142d4-7511-4ad7-8599-27eb607fb332","Type":"ContainerStarted","Data":"c073ac7cc68ab25fa585b837079aacbe99252806212e39e3e93d5260081d30fb"} Apr 16 23:28:44.501997 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:44.501943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2h6sb" event={"ID":"f87142d4-7511-4ad7-8599-27eb607fb332","Type":"ContainerStarted","Data":"a501bbbb253c1cd5fc251cb12cbd00ff48f59a142d6ce823d4d55f98e9ea91e7"} Apr 16 23:28:44.501997 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:44.502003 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2h6sb" 
event={"ID":"f87142d4-7511-4ad7-8599-27eb607fb332","Type":"ContainerStarted","Data":"5df8fc82d84a954652ebcc48d315c7fd66c1903d02c021d9ddc4b1eb2b152cbe"} Apr 16 23:28:44.523183 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:44.523131 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2h6sb" podStartSLOduration=2.897826047 podStartE2EDuration="3.523117858s" podCreationTimestamp="2026-04-16 23:28:41 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.615498649 +0000 UTC m=+183.217292833" lastFinishedPulling="2026-04-16 23:28:43.240790455 +0000 UTC m=+183.842584644" observedRunningTime="2026-04-16 23:28:44.522211769 +0000 UTC m=+185.124005974" watchObservedRunningTime="2026-04-16 23:28:44.523117858 +0000 UTC m=+185.124912061" Apr 16 23:28:56.464865 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.464831 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-684587478b-ftwbv"] Apr 16 23:28:56.465394 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:28:56.465092 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-684587478b-ftwbv" podUID="49d6804e-8384-4066-8d9e-897d85fc6a42" Apr 16 23:28:56.530574 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.530545 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:28:56.534535 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.534517 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:28:56.708424 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708391 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708424 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708431 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708463 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708489 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708602 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708647 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708880 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708697 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5924z\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z\") pod \"49d6804e-8384-4066-8d9e-897d85fc6a42\" (UID: \"49d6804e-8384-4066-8d9e-897d85fc6a42\") " Apr 16 23:28:56.708880 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708846 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:28:56.709011 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.708935 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:28:56.709097 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.709069 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-certificates\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.709097 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.709096 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49d6804e-8384-4066-8d9e-897d85fc6a42-ca-trust-extracted\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.709271 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.709126 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:28:56.710898 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.710876 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z" (OuterVolumeSpecName: "kube-api-access-5924z") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "kube-api-access-5924z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:28:56.711002 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.710899 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:28:56.711002 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.710972 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:28:56.711002 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.710986 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "49d6804e-8384-4066-8d9e-897d85fc6a42" (UID: "49d6804e-8384-4066-8d9e-897d85fc6a42"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:28:56.810211 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.810133 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5924z\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-kube-api-access-5924z\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.810211 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.810158 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-bound-sa-token\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.810211 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.810169 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-installation-pull-secrets\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.810211 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.810178 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/49d6804e-8384-4066-8d9e-897d85fc6a42-image-registry-private-configuration\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:56.810211 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:56.810187 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49d6804e-8384-4066-8d9e-897d85fc6a42-trusted-ca\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:57.532546 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:57.532516 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-684587478b-ftwbv" Apr 16 23:28:57.562341 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:57.562315 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-684587478b-ftwbv"] Apr 16 23:28:57.565379 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:57.565356 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-684587478b-ftwbv"] Apr 16 23:28:57.718367 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:57.718330 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49d6804e-8384-4066-8d9e-897d85fc6a42-registry-tls\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:28:57.979863 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:28:57.979834 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d6804e-8384-4066-8d9e-897d85fc6a42" path="/var/lib/kubelet/pods/49d6804e-8384-4066-8d9e-897d85fc6a42/volumes" Apr 16 23:29:18.642840 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:18.642800 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" podUID="f7c4c348-989e-47d3-bc23-b520ff8bdca3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 23:29:28.642795 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:28.642754 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" podUID="f7c4c348-989e-47d3-bc23-b520ff8bdca3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 23:29:38.643521 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:38.643482 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" podUID="f7c4c348-989e-47d3-bc23-b520ff8bdca3" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 23:29:38.643885 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:38.643547 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" Apr 16 23:29:38.644035 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:38.644004 2569 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c42bbab777279b5e6270fc47d669aa4ecfe27bfa2bf5d9a69a1d5f804d658fe0"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 23:29:38.644085 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:38.644067 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" podUID="f7c4c348-989e-47d3-bc23-b520ff8bdca3" containerName="service-proxy" containerID="cri-o://c42bbab777279b5e6270fc47d669aa4ecfe27bfa2bf5d9a69a1d5f804d658fe0" gracePeriod=30 Apr 16 23:29:39.638630 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:39.638596 2569 generic.go:358] "Generic (PLEG): container finished" podID="f7c4c348-989e-47d3-bc23-b520ff8bdca3" containerID="c42bbab777279b5e6270fc47d669aa4ecfe27bfa2bf5d9a69a1d5f804d658fe0" 
exitCode=2 Apr 16 23:29:39.638789 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:39.638644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerDied","Data":"c42bbab777279b5e6270fc47d669aa4ecfe27bfa2bf5d9a69a1d5f804d658fe0"} Apr 16 23:29:39.638789 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:39.638669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-55cb6db555-xqbkf" event={"ID":"f7c4c348-989e-47d3-bc23-b520ff8bdca3","Type":"ContainerStarted","Data":"cc70090f42bfefb21e97c8ed5c0ac2950806be96a066539512fa0808cc5ea78e"} Apr 16 23:29:51.925785 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:51.925738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:29:51.928004 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:51.927983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28beae99-07bd-4677-b7d1-d83bd564ca27-metrics-certs\") pod \"network-metrics-daemon-2mrw9\" (UID: \"28beae99-07bd-4677-b7d1-d83bd564ca27\") " pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:29:52.079329 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:52.079298 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\"" Apr 16 23:29:52.087374 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:52.087348 2569 util.go:30] "No sandbox for pod can be found. 
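
The cluster-proxy restart above is the standard liveness flow: three consecutive HTTP 500 probe results on a 10s period (23:29:18, :28, :38) reach the default failureThreshold of 3, the container is killed with its 30s grace period, exits with code 2, and is started again. An illustrative prober implementing that decision rule (the endpoint and thresholds are assumptions, not read from this pod's spec):

    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    func main() {
    	const (
    		periodSeconds    = 10 * time.Second // matches the 23:29:18/28/38 cadence
    		failureThreshold = 3                // Kubernetes default
    		url              = "http://127.0.0.1:8080/healthz" // hypothetical probe endpoint
    	)
    	failures := 0
    	for range time.Tick(periodSeconds) {
    		resp, err := http.Get(url)
    		healthy := err == nil && resp.StatusCode < 400 // a 500 counts as a failure
    		if resp != nil {
    			resp.Body.Close()
    		}
    		if healthy {
    			failures = 0 // any success resets the counter
    			continue
    		}
    		failures++
    		fmt.Printf("probe failed (%d/%d)\n", failures, failureThreshold)
    		if failures >= failureThreshold {
    			fmt.Println("unhealthy: container would be killed and restarted")
    			return
    		}
    	}
    }
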
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2mrw9" Apr 16 23:29:52.195196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:52.195112 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2mrw9"] Apr 16 23:29:52.197781 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:29:52.197757 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28beae99_07bd_4677_b7d1_d83bd564ca27.slice/crio-947724482adb014095b2d715eaf9c19fccf390586de4d6b1da12f2175a845181 WatchSource:0}: Error finding container 947724482adb014095b2d715eaf9c19fccf390586de4d6b1da12f2175a845181: Status 404 returned error can't find the container with id 947724482adb014095b2d715eaf9c19fccf390586de4d6b1da12f2175a845181 Apr 16 23:29:52.670377 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:52.670342 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mrw9" event={"ID":"28beae99-07bd-4677-b7d1-d83bd564ca27","Type":"ContainerStarted","Data":"947724482adb014095b2d715eaf9c19fccf390586de4d6b1da12f2175a845181"} Apr 16 23:29:53.674834 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:53.674797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mrw9" event={"ID":"28beae99-07bd-4677-b7d1-d83bd564ca27","Type":"ContainerStarted","Data":"9aca81dd6a3997353d54d22d1ff15b3a5d091ad54b9eb1e6e84527f661b2e731"} Apr 16 23:29:53.674834 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:53.674840 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2mrw9" event={"ID":"28beae99-07bd-4677-b7d1-d83bd564ca27","Type":"ContainerStarted","Data":"9560dff5f1b34af576e0563a8afa954c9db7dd7f3fbdb06f74f66484310051d6"} Apr 16 23:29:53.689926 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:29:53.689885 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2mrw9" podStartSLOduration=252.591163114 podStartE2EDuration="4m13.689872683s" podCreationTimestamp="2026-04-16 23:25:40 +0000 UTC" firstStartedPulling="2026-04-16 23:29:52.199613847 +0000 UTC m=+252.801408030" lastFinishedPulling="2026-04-16 23:29:53.298323412 +0000 UTC m=+253.900117599" observedRunningTime="2026-04-16 23:29:53.688457795 +0000 UTC m=+254.290252000" watchObservedRunningTime="2026-04-16 23:29:53.689872683 +0000 UTC m=+254.291666942" Apr 16 23:30:19.421210 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:30:19.421152 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" podUID="a35ffddb-daf5-4419-8950-a0f8be5ddeba" Apr 16 23:30:19.421210 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:30:19.421152 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9jbxk" podUID="521d672a-517e-4c54-a3c8-a1af436fb79c" Apr 16 23:30:19.743900 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:19.743825 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9jbxk" Apr 16 23:30:19.744057 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:19.743825 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:30:22.858399 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.858356 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:30:22.858951 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.858411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:30:22.858951 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.858503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:30:22.860784 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.860762 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a35ffddb-daf5-4419-8950-a0f8be5ddeba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c6rh\" (UID: \"a35ffddb-daf5-4419-8950-a0f8be5ddeba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:30:22.861146 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.861128 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/521d672a-517e-4c54-a3c8-a1af436fb79c-metrics-tls\") pod \"dns-default-9jbxk\" (UID: \"521d672a-517e-4c54-a3c8-a1af436fb79c\") " pod="openshift-dns/dns-default-9jbxk" Apr 16 23:30:22.861241 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.861224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3711a978-4a77-4055-9046-6ebb4a5ffeb1-cert\") pod \"ingress-canary-2gmq5\" (UID: \"3711a978-4a77-4055-9046-6ebb4a5ffeb1\") " pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:30:22.880232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.880203 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\"" Apr 16 23:30:22.888209 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.888188 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2gmq5" Apr 16 23:30:22.996170 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:22.996144 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2gmq5"] Apr 16 23:30:22.998445 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:30:22.998404 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3711a978_4a77_4055_9046_6ebb4a5ffeb1.slice/crio-3168ac9a9e1a53ad922704896da2243f5e3a7565448bb08ee4ce6830a1a7a856 WatchSource:0}: Error finding container 3168ac9a9e1a53ad922704896da2243f5e3a7565448bb08ee4ce6830a1a7a856: Status 404 returned error can't find the container with id 3168ac9a9e1a53ad922704896da2243f5e3a7565448bb08ee4ce6830a1a7a856 Apr 16 23:30:23.046877 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.046851 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\"" Apr 16 23:30:23.047043 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.046881 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-75sgs\"" Apr 16 23:30:23.054798 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.054781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" Apr 16 23:30:23.054868 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.054803 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9jbxk" Apr 16 23:30:23.178460 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.178430 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh"] Apr 16 23:30:23.182943 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:30:23.182902 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35ffddb_daf5_4419_8950_a0f8be5ddeba.slice/crio-e6e075bb5469c8d17e83b2c1eb1d71c6d0a87387372498f7899a340f390df088 WatchSource:0}: Error finding container e6e075bb5469c8d17e83b2c1eb1d71c6d0a87387372498f7899a340f390df088: Status 404 returned error can't find the container with id e6e075bb5469c8d17e83b2c1eb1d71c6d0a87387372498f7899a340f390df088 Apr 16 23:30:23.199266 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.197313 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9jbxk"] Apr 16 23:30:23.202908 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:30:23.202885 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod521d672a_517e_4c54_a3c8_a1af436fb79c.slice/crio-62000b0044a3d55c69a614d52cff0844aee4ecff0462722a9d0a354eeb905e53 WatchSource:0}: Error finding container 62000b0044a3d55c69a614d52cff0844aee4ecff0462722a9d0a354eeb905e53: Status 404 returned error can't find the container with id 62000b0044a3d55c69a614d52cff0844aee4ecff0462722a9d0a354eeb905e53 Apr 16 23:30:23.757947 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.757908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jbxk" event={"ID":"521d672a-517e-4c54-a3c8-a1af436fb79c","Type":"ContainerStarted","Data":"62000b0044a3d55c69a614d52cff0844aee4ecff0462722a9d0a354eeb905e53"} Apr 16 23:30:23.759610 ip-10-0-136-147 
kubenswrapper[2569]: I0416 23:30:23.759520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" event={"ID":"a35ffddb-daf5-4419-8950-a0f8be5ddeba","Type":"ContainerStarted","Data":"e6e075bb5469c8d17e83b2c1eb1d71c6d0a87387372498f7899a340f390df088"} Apr 16 23:30:23.761161 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:23.761127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gmq5" event={"ID":"3711a978-4a77-4055-9046-6ebb4a5ffeb1","Type":"ContainerStarted","Data":"3168ac9a9e1a53ad922704896da2243f5e3a7565448bb08ee4ce6830a1a7a856"} Apr 16 23:30:25.768103 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.768013 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" event={"ID":"a35ffddb-daf5-4419-8950-a0f8be5ddeba","Type":"ContainerStarted","Data":"4c8c206c307e94a0dcd9b50a0f7dde727fe5cd95f70c83d7ca2ebecf1d1dca41"} Apr 16 23:30:25.769350 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.769324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2gmq5" event={"ID":"3711a978-4a77-4055-9046-6ebb4a5ffeb1","Type":"ContainerStarted","Data":"c9834dd12ad9ea1668311163d086ea09c883a196776b3f9f476e8021f43475b3"} Apr 16 23:30:25.770599 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.770582 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jbxk" event={"ID":"521d672a-517e-4c54-a3c8-a1af436fb79c","Type":"ContainerStarted","Data":"e516e6d4fe2d23587052590a03d8ac23fdca5789f3df8ce59b84a45549c5afa8"} Apr 16 23:30:25.770679 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.770604 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9jbxk" event={"ID":"521d672a-517e-4c54-a3c8-a1af436fb79c","Type":"ContainerStarted","Data":"7848ad6dc8894549185e30291f489394225aacad67197562a9401778c6b2db47"} Apr 16 23:30:25.770751 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.770739 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9jbxk" Apr 16 23:30:25.785534 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.785492 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c6rh" podStartSLOduration=266.809815867 podStartE2EDuration="4m28.785479733s" podCreationTimestamp="2026-04-16 23:25:57 +0000 UTC" firstStartedPulling="2026-04-16 23:30:23.184953031 +0000 UTC m=+283.786747215" lastFinishedPulling="2026-04-16 23:30:25.160616888 +0000 UTC m=+285.762411081" observedRunningTime="2026-04-16 23:30:25.784638749 +0000 UTC m=+286.386432954" watchObservedRunningTime="2026-04-16 23:30:25.785479733 +0000 UTC m=+286.387273939" Apr 16 23:30:25.802717 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.802669 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9jbxk" podStartSLOduration=251.844503522 podStartE2EDuration="4m13.802653069s" podCreationTimestamp="2026-04-16 23:26:12 +0000 UTC" firstStartedPulling="2026-04-16 23:30:23.20430333 +0000 UTC m=+283.806097516" lastFinishedPulling="2026-04-16 23:30:25.162452875 +0000 UTC m=+285.764247063" observedRunningTime="2026-04-16 23:30:25.801484902 +0000 UTC m=+286.403279108" watchObservedRunningTime="2026-04-16 23:30:25.802653069 +0000 UTC m=+286.404447276" Apr 16 
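
The "Observed pod startup duration" lines are self-consistent: podStartSLOduration is podStartE2EDuration minus the image-pull window (the startup SLI excludes time spent pulling images), which the dns-default-9jbxk numbers confirm to within printing precision:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const layout = "2006-01-02 15:04:05"
    	// Timestamps from the dns-default-9jbxk line above.
    	firstPull, _ := time.Parse(layout, "2026-04-16 23:30:23.20430333")
    	lastPull, _ := time.Parse(layout, "2026-04-16 23:30:25.162452875")
    	e2e, _ := time.ParseDuration("4m13.802653069s") // podStartE2EDuration

    	slo := e2e - lastPull.Sub(firstPull)
    	fmt.Println(slo.Seconds()) // ≈251.844503, matching podStartSLOduration=251.844503522
    }
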
Apr 16 23:30:25.816609 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:25.816571 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2gmq5" podStartSLOduration=251.649739771 podStartE2EDuration="4m13.816558222s" podCreationTimestamp="2026-04-16 23:26:12 +0000 UTC" firstStartedPulling="2026-04-16 23:30:23.000166388 +0000 UTC m=+283.601960583" lastFinishedPulling="2026-04-16 23:30:25.166984842 +0000 UTC m=+285.768779034" observedRunningTime="2026-04-16 23:30:25.815700667 +0000 UTC m=+286.417494875" watchObservedRunningTime="2026-04-16 23:30:25.816558222 +0000 UTC m=+286.418352428"
Apr 16 23:30:35.775338 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:35.775308 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9jbxk"
Apr 16 23:30:39.923767 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:30:39.923737 2569 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 23:31:45.595818 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.595737 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"]
Apr 16 23:31:45.598653 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.598637 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.601090 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.601068 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 23:31:45.602003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.601987 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-p8ksn\""
Apr 16 23:31:45.602003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.601998 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:31:45.607712 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.607671 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"]
Apr 16 23:31:45.727257 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.727214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2bkn\" (UniqueName: \"kubernetes.io/projected/0476080a-4f5d-4f99-8ca4-ef93b1169b46-kube-api-access-c2bkn\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.727257 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.727258 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0476080a-4f5d-4f99-8ca4-ef93b1169b46-tmp\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.828214 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.828176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2bkn\" (UniqueName: \"kubernetes.io/projected/0476080a-4f5d-4f99-8ca4-ef93b1169b46-kube-api-access-c2bkn\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.828214 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.828219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0476080a-4f5d-4f99-8ca4-ef93b1169b46-tmp\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.828534 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.828519 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0476080a-4f5d-4f99-8ca4-ef93b1169b46-tmp\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.835901 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.835869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2bkn\" (UniqueName: \"kubernetes.io/projected/0476080a-4f5d-4f99-8ca4-ef93b1169b46-kube-api-access-c2bkn\") pod \"openshift-lws-operator-bfc7f696d-65rd7\" (UID: \"0476080a-4f5d-4f99-8ca4-ef93b1169b46\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:45.907681 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:45.907653 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"
Apr 16 23:31:46.023542 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:46.022062 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7"]
Apr 16 23:31:46.028713 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:31:46.028684 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0476080a_4f5d_4f99_8ca4_ef93b1169b46.slice/crio-d8da3684dd76968dbcf99aa1b4b11588cd3b40da73cb5f801142ce9c734cff45 WatchSource:0}: Error finding container d8da3684dd76968dbcf99aa1b4b11588cd3b40da73cb5f801142ce9c734cff45: Status 404 returned error can't find the container with id d8da3684dd76968dbcf99aa1b4b11588cd3b40da73cb5f801142ce9c734cff45
Apr 16 23:31:46.031916 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:46.031899 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:31:46.972664 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:46.972614 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7" event={"ID":"0476080a-4f5d-4f99-8ca4-ef93b1169b46","Type":"ContainerStarted","Data":"d8da3684dd76968dbcf99aa1b4b11588cd3b40da73cb5f801142ce9c734cff45"}
Apr 16 23:31:48.979425 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:48.979386 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7" event={"ID":"0476080a-4f5d-4f99-8ca4-ef93b1169b46","Type":"ContainerStarted","Data":"e4a0e68cff2d6ada5cc6daa0f35d03d82ec449a60aeb314c983bc6fafb20503e"}
Apr 16 23:31:48.994613 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:31:48.994567 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-65rd7" podStartSLOduration=1.486586888 podStartE2EDuration="3.99455391s" podCreationTimestamp="2026-04-16 23:31:45 +0000 UTC" firstStartedPulling="2026-04-16 23:31:46.032061292 +0000 UTC m=+366.633855476" lastFinishedPulling="2026-04-16 23:31:48.540028302 +0000 UTC m=+369.141822498" observedRunningTime="2026-04-16 23:31:48.993292298 +0000 UTC m=+369.595086505" watchObservedRunningTime="2026-04-16 23:31:48.99455391 +0000 UTC m=+369.596348116"
Apr 16 23:32:08.196196 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.196159 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"]
Apr 16 23:32:08.199143 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.199123 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"
Apr 16 23:32:08.202871 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.202848 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 23:32:08.203026 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.202867 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.203136 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.203116 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.203189 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.203117 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 23:32:08.203239 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.203138 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-v6gnw\""
Apr 16 23:32:08.217041 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.217019 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"]
Apr 16 23:32:08.294108 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.294073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"
Apr 16 23:32:08.294108 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.294111 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"
Apr 16 23:32:08.294300 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.294132 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vc5t\" (UniqueName: \"kubernetes.io/projected/bbc0caec-b3c4-437a-a8cd-00e91951b67f-kube-api-access-2vc5t\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") "
pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.395299 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.395268 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.395299 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.395299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.395522 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.395321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vc5t\" (UniqueName: \"kubernetes.io/projected/bbc0caec-b3c4-437a-a8cd-00e91951b67f-kube-api-access-2vc5t\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.397780 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.397755 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.397879 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.397806 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbc0caec-b3c4-437a-a8cd-00e91951b67f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.415138 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.415117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vc5t\" (UniqueName: \"kubernetes.io/projected/bbc0caec-b3c4-437a-a8cd-00e91951b67f-kube-api-access-2vc5t\") pod \"opendatahub-operator-controller-manager-8bf69b96d-sj6lz\" (UID: \"bbc0caec-b3c4-437a-a8cd-00e91951b67f\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.508721 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.508651 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:08.632634 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:08.632604 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz"] Apr 16 23:32:08.635402 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:08.635375 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbc0caec_b3c4_437a_a8cd_00e91951b67f.slice/crio-84e1e5a3c9270c3e05436777c85780ad69e082e44e8a25be85868a7f4411e132 WatchSource:0}: Error finding container 84e1e5a3c9270c3e05436777c85780ad69e082e44e8a25be85868a7f4411e132: Status 404 returned error can't find the container with id 84e1e5a3c9270c3e05436777c85780ad69e082e44e8a25be85868a7f4411e132 Apr 16 23:32:09.032598 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:09.032565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" event={"ID":"bbc0caec-b3c4-437a-a8cd-00e91951b67f","Type":"ContainerStarted","Data":"84e1e5a3c9270c3e05436777c85780ad69e082e44e8a25be85868a7f4411e132"} Apr 16 23:32:12.041946 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:12.041908 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" event={"ID":"bbc0caec-b3c4-437a-a8cd-00e91951b67f","Type":"ContainerStarted","Data":"6b52772a8d166764c9af06cc79c71c9b1eff616f3735897436ed57a0236dd0a1"} Apr 16 23:32:12.042432 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:12.042096 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:12.061459 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:12.061400 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" podStartSLOduration=1.595827671 podStartE2EDuration="4.061381981s" podCreationTimestamp="2026-04-16 23:32:08 +0000 UTC" firstStartedPulling="2026-04-16 23:32:08.637167797 +0000 UTC m=+389.238961984" lastFinishedPulling="2026-04-16 23:32:11.102722106 +0000 UTC m=+391.704516294" observedRunningTime="2026-04-16 23:32:12.059888684 +0000 UTC m=+392.661682900" watchObservedRunningTime="2026-04-16 23:32:12.061381981 +0000 UTC m=+392.663176188" Apr 16 23:32:23.047249 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:23.047218 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-sj6lz" Apr 16 23:32:26.296017 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.295981 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-6t5sg"] Apr 16 23:32:26.299096 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.299077 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.301778 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.301756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 23:32:26.301890 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.301839 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 23:32:26.301890 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.301852 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 23:32:26.303054 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.303032 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-245l5\"" Apr 16 23:32:26.303149 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.303088 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 23:32:26.310123 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.310093 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-6t5sg"] Apr 16 23:32:26.319778 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.319741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tmp\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.319881 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.319846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tls-certs\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.319947 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.319896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-694mb\" (UniqueName: \"kubernetes.io/projected/995cc232-3ec9-4d93-9d5c-b77cbca875ec-kube-api-access-694mb\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.420250 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.420216 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tls-certs\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.420250 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.420257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-694mb\" (UniqueName: \"kubernetes.io/projected/995cc232-3ec9-4d93-9d5c-b77cbca875ec-kube-api-access-694mb\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.420501 ip-10-0-136-147 kubenswrapper[2569]: 
I0416 23:32:26.420301 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tmp\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.422537 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.422514 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tmp\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.422740 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.422722 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/995cc232-3ec9-4d93-9d5c-b77cbca875ec-tls-certs\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.427880 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.427848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-694mb\" (UniqueName: \"kubernetes.io/projected/995cc232-3ec9-4d93-9d5c-b77cbca875ec-kube-api-access-694mb\") pod \"kube-auth-proxy-577868f455-6t5sg\" (UID: \"995cc232-3ec9-4d93-9d5c-b77cbca875ec\") " pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.607900 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.607866 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" Apr 16 23:32:26.728112 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:26.728087 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-577868f455-6t5sg"] Apr 16 23:32:26.730375 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:26.730347 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod995cc232_3ec9_4d93_9d5c_b77cbca875ec.slice/crio-3d4cac5a7e24abc968d019a557dfd5cd9301457c33d6d947a7301847bc66f1d7 WatchSource:0}: Error finding container 3d4cac5a7e24abc968d019a557dfd5cd9301457c33d6d947a7301847bc66f1d7: Status 404 returned error can't find the container with id 3d4cac5a7e24abc968d019a557dfd5cd9301457c33d6d947a7301847bc66f1d7 Apr 16 23:32:27.085377 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:27.085293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" event={"ID":"995cc232-3ec9-4d93-9d5c-b77cbca875ec","Type":"ContainerStarted","Data":"3d4cac5a7e24abc968d019a557dfd5cd9301457c33d6d947a7301847bc66f1d7"} Apr 16 23:32:29.776860 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.776826 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-phsfh"] Apr 16 23:32:29.780243 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.780222 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:29.782570 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.782545 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 23:32:29.782718 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.782611 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-j88z7\"" Apr 16 23:32:29.786818 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.786794 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-phsfh"] Apr 16 23:32:29.844631 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.844582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:29.844631 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.844632 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqv4c\" (UniqueName: \"kubernetes.io/projected/7e21b3a7-9358-46a5-bb91-9f52f04115e3-kube-api-access-mqv4c\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:29.945178 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.945148 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:29.945298 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.945196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqv4c\" (UniqueName: \"kubernetes.io/projected/7e21b3a7-9358-46a5-bb91-9f52f04115e3-kube-api-access-mqv4c\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:29.945477 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:29.945341 2569 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 23:32:29.945477 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:29.945462 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert podName:7e21b3a7-9358-46a5-bb91-9f52f04115e3 nodeName:}" failed. No retries permitted until 2026-04-16 23:32:30.44541962 +0000 UTC m=+411.047213820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert") pod "odh-model-controller-858dbf95b8-phsfh" (UID: "7e21b3a7-9358-46a5-bb91-9f52f04115e3") : secret "odh-model-controller-webhook-cert" not found Apr 16 23:32:29.956175 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:29.956155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqv4c\" (UniqueName: \"kubernetes.io/projected/7e21b3a7-9358-46a5-bb91-9f52f04115e3-kube-api-access-mqv4c\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:30.097792 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:30.097718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" event={"ID":"995cc232-3ec9-4d93-9d5c-b77cbca875ec","Type":"ContainerStarted","Data":"58292fa3a709fc763046e2622cb28153c7d50c0a6baeb64fd911a3c6c0eff00c"} Apr 16 23:32:30.112925 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:30.112882 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-577868f455-6t5sg" podStartSLOduration=0.930386068 podStartE2EDuration="4.112869099s" podCreationTimestamp="2026-04-16 23:32:26 +0000 UTC" firstStartedPulling="2026-04-16 23:32:26.732203417 +0000 UTC m=+407.333997601" lastFinishedPulling="2026-04-16 23:32:29.914686442 +0000 UTC m=+410.516480632" observedRunningTime="2026-04-16 23:32:30.111436754 +0000 UTC m=+410.713230960" watchObservedRunningTime="2026-04-16 23:32:30.112869099 +0000 UTC m=+410.714663320" Apr 16 23:32:30.448586 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:30.448547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:30.448766 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:30.448642 2569 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 23:32:30.448766 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:30.448707 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert podName:7e21b3a7-9358-46a5-bb91-9f52f04115e3 nodeName:}" failed. No retries permitted until 2026-04-16 23:32:31.448691442 +0000 UTC m=+412.050485625 (durationBeforeRetry 1s). 
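The two failures above are the kubelet waiting for a secret that does not exist yet; note the growing durationBeforeRetry (500ms, then 1s). The mount succeeds at 23:32:31.458 just below, so the secret was evidently created in the interim, presumably by the operator that owns it. A minimal sketch for checking whether such a secret exists yet, assuming a reachable cluster and a kubeconfig at the default path (the namespace and secret name are taken from the log):

```go
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	s, err := cs.CoreV1().Secrets("opendatahub").Get(context.Background(),
		"odh-model-controller-webhook-cert", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println("not created yet; the kubelet will keep retrying the mount")
	case err != nil:
		panic(err) // RBAC or connectivity, a different problem than the race in the log
	default:
		fmt.Println("present, created at", s.CreationTimestamp)
	}
}
```

Distinguishing IsNotFound from other errors matters here: only the not-found case matches the transient ordering race this log shows.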
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert") pod "odh-model-controller-858dbf95b8-phsfh" (UID: "7e21b3a7-9358-46a5-bb91-9f52f04115e3") : secret "odh-model-controller-webhook-cert" not found Apr 16 23:32:31.456375 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:31.456340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:31.458679 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:31.458650 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e21b3a7-9358-46a5-bb91-9f52f04115e3-cert\") pod \"odh-model-controller-858dbf95b8-phsfh\" (UID: \"7e21b3a7-9358-46a5-bb91-9f52f04115e3\") " pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:31.593846 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:31.593809 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:31.709628 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:31.709560 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-phsfh"] Apr 16 23:32:31.712615 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:31.712582 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e21b3a7_9358_46a5_bb91_9f52f04115e3.slice/crio-9fac84f4141c613ff80d6d610f4148c7aaa83bb9fc1f2764c52fb380f2e53ff2 WatchSource:0}: Error finding container 9fac84f4141c613ff80d6d610f4148c7aaa83bb9fc1f2764c52fb380f2e53ff2: Status 404 returned error can't find the container with id 9fac84f4141c613ff80d6d610f4148c7aaa83bb9fc1f2764c52fb380f2e53ff2 Apr 16 23:32:32.103540 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:32.103511 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" event={"ID":"7e21b3a7-9358-46a5-bb91-9f52f04115e3","Type":"ContainerStarted","Data":"9fac84f4141c613ff80d6d610f4148c7aaa83bb9fc1f2764c52fb380f2e53ff2"} Apr 16 23:32:35.113132 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.113101 2569 generic.go:358] "Generic (PLEG): container finished" podID="7e21b3a7-9358-46a5-bb91-9f52f04115e3" containerID="d4205d1e4b2352bc258cfe473e9b43a140ce7054b6968540a8c7ee415a644a2c" exitCode=1 Apr 16 23:32:35.113545 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.113176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" event={"ID":"7e21b3a7-9358-46a5-bb91-9f52f04115e3","Type":"ContainerDied","Data":"d4205d1e4b2352bc258cfe473e9b43a140ce7054b6968540a8c7ee415a644a2c"} Apr 16 23:32:35.113545 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.113351 2569 scope.go:117] "RemoveContainer" containerID="d4205d1e4b2352bc258cfe473e9b43a140ce7054b6968540a8c7ee415a644a2c" Apr 16 23:32:35.845558 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.845480 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rc7c4"] Apr 16 23:32:35.848538 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.848518 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:35.850847 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.850828 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 23:32:35.850999 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.850954 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-xw5x8\"" Apr 16 23:32:35.857088 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.857067 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rc7c4"] Apr 16 23:32:35.886562 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.886524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rkf\" (UniqueName: \"kubernetes.io/projected/72a63abb-7588-45a5-8a4a-bb33634f216a-kube-api-access-r4rkf\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:35.886715 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.886619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:35.987609 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.987577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rkf\" (UniqueName: \"kubernetes.io/projected/72a63abb-7588-45a5-8a4a-bb33634f216a-kube-api-access-r4rkf\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:35.987765 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:35.987632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:35.987765 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:35.987726 2569 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 23:32:35.987834 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:35.987786 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert podName:72a63abb-7588-45a5-8a4a-bb33634f216a nodeName:}" failed. No retries permitted until 2026-04-16 23:32:36.487768993 +0000 UTC m=+417.089563178 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert") pod "kserve-controller-manager-856948b99f-rc7c4" (UID: "72a63abb-7588-45a5-8a4a-bb33634f216a") : secret "kserve-webhook-server-cert" not found Apr 16 23:32:36.007703 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.007670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rkf\" (UniqueName: \"kubernetes.io/projected/72a63abb-7588-45a5-8a4a-bb33634f216a-kube-api-access-r4rkf\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:36.117184 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.117151 2569 generic.go:358] "Generic (PLEG): container finished" podID="7e21b3a7-9358-46a5-bb91-9f52f04115e3" containerID="8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988" exitCode=1 Apr 16 23:32:36.117618 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.117219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" event={"ID":"7e21b3a7-9358-46a5-bb91-9f52f04115e3","Type":"ContainerDied","Data":"8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988"} Apr 16 23:32:36.117618 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.117261 2569 scope.go:117] "RemoveContainer" containerID="d4205d1e4b2352bc258cfe473e9b43a140ce7054b6968540a8c7ee415a644a2c" Apr 16 23:32:36.117618 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.117438 2569 scope.go:117] "RemoveContainer" containerID="8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988" Apr 16 23:32:36.117618 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:36.117612 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-phsfh_opendatahub(7e21b3a7-9358-46a5-bb91-9f52f04115e3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" podUID="7e21b3a7-9358-46a5-bb91-9f52f04115e3" Apr 16 23:32:36.491955 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.491863 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:36.494271 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.494243 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72a63abb-7588-45a5-8a4a-bb33634f216a-cert\") pod \"kserve-controller-manager-856948b99f-rc7c4\" (UID: \"72a63abb-7588-45a5-8a4a-bb33634f216a\") " pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:36.761236 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.761158 2569 util.go:30] "No sandbox for pod can be found. 
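
By this point the odh-model-controller pod's manager container has exited with code 1 twice (d4205d1e..., then 8dc8ddc9...), and the kubelet has moved it into CrashLoopBackOff with a 10s back-off. The kubelet log only records the exits; the reason lives in the container's own output. A sketch for pulling the previous (crashed) instance's logs, under the same cluster and kubeconfig assumptions as the earlier snippet (pod, namespace, and container name are taken from the log):

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Previous selects the instance that just exited with code 1.
	raw, err := cs.CoreV1().Pods("opendatahub").
		GetLogs("odh-model-controller-858dbf95b8-phsfh", &corev1.PodLogOptions{
			Container: "manager",
			Previous:  true,
		}).Do(context.Background()).Raw()
	if err != nil {
		panic(err)
	}
	fmt.Print(string(raw))
}
```
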
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:36.910533 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:36.910510 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rc7c4"] Apr 16 23:32:36.912132 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:36.912102 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a63abb_7588_45a5_8a4a_bb33634f216a.slice/crio-d2ea2c6bc206c061fbe4ac574672e77db0404f34e1a846f085a919ec222446f6 WatchSource:0}: Error finding container d2ea2c6bc206c061fbe4ac574672e77db0404f34e1a846f085a919ec222446f6: Status 404 returned error can't find the container with id d2ea2c6bc206c061fbe4ac574672e77db0404f34e1a846f085a919ec222446f6 Apr 16 23:32:37.122014 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:37.121987 2569 scope.go:117] "RemoveContainer" containerID="8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988" Apr 16 23:32:37.122426 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:37.122196 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-phsfh_opendatahub(7e21b3a7-9358-46a5-bb91-9f52f04115e3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" podUID="7e21b3a7-9358-46a5-bb91-9f52f04115e3" Apr 16 23:32:37.122864 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:37.122843 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" event={"ID":"72a63abb-7588-45a5-8a4a-bb33634f216a","Type":"ContainerStarted","Data":"d2ea2c6bc206c061fbe4ac574672e77db0404f34e1a846f085a919ec222446f6"} Apr 16 23:32:38.958036 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.958001 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gqxck"] Apr 16 23:32:38.962345 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.962326 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:38.967809 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.967786 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 23:32:38.967929 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.967910 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 23:32:38.968016 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.967951 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-dmzxr\"" Apr 16 23:32:38.976543 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:38.976519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gqxck"] Apr 16 23:32:39.010218 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.010190 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwh2d\" (UniqueName: \"kubernetes.io/projected/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-kube-api-access-nwh2d\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: \"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.010371 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.010254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: \"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.111302 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.111264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwh2d\" (UniqueName: \"kubernetes.io/projected/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-kube-api-access-nwh2d\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: \"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.111471 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.111323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: \"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.114446 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.114420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-operator-config\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: \"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.119902 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.119838 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwh2d\" (UniqueName: \"kubernetes.io/projected/f60cd135-6d3e-47e0-8b7d-d73860cb1b10-kube-api-access-nwh2d\") pod \"servicemesh-operator3-55f49c5f94-gqxck\" (UID: 
\"f60cd135-6d3e-47e0-8b7d-d73860cb1b10\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.272868 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.272841 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:39.438096 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:39.438063 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-gqxck"] Apr 16 23:32:39.441663 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:39.441631 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60cd135_6d3e_47e0_8b7d_d73860cb1b10.slice/crio-7476f8ad611a518d1ae31e9f7f5a0936c378be8775aef9ed2c91eb7d40b44747 WatchSource:0}: Error finding container 7476f8ad611a518d1ae31e9f7f5a0936c378be8775aef9ed2c91eb7d40b44747: Status 404 returned error can't find the container with id 7476f8ad611a518d1ae31e9f7f5a0936c378be8775aef9ed2c91eb7d40b44747 Apr 16 23:32:40.135182 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:40.135146 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" event={"ID":"72a63abb-7588-45a5-8a4a-bb33634f216a","Type":"ContainerStarted","Data":"a68a22942b16fac3fa53dbace090fce7f93ac87f72638e67d39dd111b1fe35fe"} Apr 16 23:32:40.135625 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:40.135435 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:32:40.136437 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:40.136412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" event={"ID":"f60cd135-6d3e-47e0-8b7d-d73860cb1b10","Type":"ContainerStarted","Data":"7476f8ad611a518d1ae31e9f7f5a0936c378be8775aef9ed2c91eb7d40b44747"} Apr 16 23:32:40.154280 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:40.154238 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" podStartSLOduration=2.870234611 podStartE2EDuration="5.154226401s" podCreationTimestamp="2026-04-16 23:32:35 +0000 UTC" firstStartedPulling="2026-04-16 23:32:36.913459558 +0000 UTC m=+417.515253743" lastFinishedPulling="2026-04-16 23:32:39.197451344 +0000 UTC m=+419.799245533" observedRunningTime="2026-04-16 23:32:40.152272686 +0000 UTC m=+420.754066889" watchObservedRunningTime="2026-04-16 23:32:40.154226401 +0000 UTC m=+420.756020648" Apr 16 23:32:41.594232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:41.594195 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:41.594579 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:41.594571 2569 scope.go:117] "RemoveContainer" containerID="8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988" Apr 16 23:32:41.594761 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:32:41.594743 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-phsfh_opendatahub(7e21b3a7-9358-46a5-bb91-9f52f04115e3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" podUID="7e21b3a7-9358-46a5-bb91-9f52f04115e3" Apr 16 
23:32:47.159482 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:47.159445 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" event={"ID":"f60cd135-6d3e-47e0-8b7d-d73860cb1b10","Type":"ContainerStarted","Data":"309dbf60781a575ac08bbc11aa2a06bf1eb39b5fc2f3274d0bc609f05e298da1"} Apr 16 23:32:47.159915 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:47.159583 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:47.182520 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:47.182469 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" podStartSLOduration=2.223629923 podStartE2EDuration="9.182457104s" podCreationTimestamp="2026-04-16 23:32:38 +0000 UTC" firstStartedPulling="2026-04-16 23:32:39.444338814 +0000 UTC m=+420.046133011" lastFinishedPulling="2026-04-16 23:32:46.403165995 +0000 UTC m=+427.004960192" observedRunningTime="2026-04-16 23:32:47.180178819 +0000 UTC m=+427.781973026" watchObservedRunningTime="2026-04-16 23:32:47.182457104 +0000 UTC m=+427.784251310" Apr 16 23:32:51.594830 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:51.594783 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:51.595204 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:51.595185 2569 scope.go:117] "RemoveContainer" containerID="8dc8ddc9a43baa2cb91f2ba4f8d1d4340193056276538b71755e5d982e34a988" Apr 16 23:32:52.176147 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.176055 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" event={"ID":"7e21b3a7-9358-46a5-bb91-9f52f04115e3","Type":"ContainerStarted","Data":"590b4b0594d085acc52f7230d04e8612f4c9795099470e9074d2e5ab25a7adf6"} Apr 16 23:32:52.176314 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.176291 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:32:52.191387 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.191338 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" podStartSLOduration=3.098901751 podStartE2EDuration="23.191324163s" podCreationTimestamp="2026-04-16 23:32:29 +0000 UTC" firstStartedPulling="2026-04-16 23:32:31.713899607 +0000 UTC m=+412.315693792" lastFinishedPulling="2026-04-16 23:32:51.806322017 +0000 UTC m=+432.408116204" observedRunningTime="2026-04-16 23:32:52.191056266 +0000 UTC m=+432.792850472" watchObservedRunningTime="2026-04-16 23:32:52.191324163 +0000 UTC m=+432.793118369" Apr 16 23:32:52.539701 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.539577 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4"] Apr 16 23:32:52.542729 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.542708 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.545473 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.545448 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 23:32:52.545601 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.545452 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 23:32:52.545601 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.545563 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 23:32:52.545719 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.545684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-f88bw\"" Apr 16 23:32:52.545821 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.545805 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 23:32:52.553441 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.553423 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4"] Apr 16 23:32:52.618031 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.617997 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618064 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbh8\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-kube-api-access-gzbh8\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618154 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: 
\"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.618380 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.618203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718525 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718492 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718525 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718774 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718774 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718677 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718774 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718727 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.718774 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbh8\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-kube-api-access-gzbh8\") pod 
\"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.719005 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.718835 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.719385 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.719357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.720890 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.720861 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.721108 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.721086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.721325 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.721309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.721430 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.721411 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.727857 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.727832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbh8\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-kube-api-access-gzbh8\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: \"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.727953 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.727903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9ac6d6e8-434f-43b8-9161-ebc9bbd75e46-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-nrgs4\" (UID: 
\"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.854312 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.854277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:52.998485 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:52.998456 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4"] Apr 16 23:32:53.000245 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:32:53.000216 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac6d6e8_434f_43b8_9161_ebc9bbd75e46.slice/crio-650f1a2b3a158483d7014bd68150920ef5432711906e60ef099f352d64822478 WatchSource:0}: Error finding container 650f1a2b3a158483d7014bd68150920ef5432711906e60ef099f352d64822478: Status 404 returned error can't find the container with id 650f1a2b3a158483d7014bd68150920ef5432711906e60ef099f352d64822478 Apr 16 23:32:53.180443 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:53.180349 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" event={"ID":"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46","Type":"ContainerStarted","Data":"650f1a2b3a158483d7014bd68150920ef5432711906e60ef099f352d64822478"} Apr 16 23:32:56.264861 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:56.264823 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:32:56.265175 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:56.264900 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:32:57.196938 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:57.196899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" event={"ID":"9ac6d6e8-434f-43b8-9161-ebc9bbd75e46","Type":"ContainerStarted","Data":"ccf0e87007674e3b4415a7811ed8cc56b2799d080ca2cb6fc9e19f7e58234888"} Apr 16 23:32:57.197199 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:57.197157 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:32:57.198808 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:57.198781 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-nrgs4 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 23:32:57.198932 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:57.198840 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" podUID="9ac6d6e8-434f-43b8-9161-ebc9bbd75e46" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:32:57.225519 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:57.224879 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" podStartSLOduration=1.962984048 podStartE2EDuration="5.224861832s" podCreationTimestamp="2026-04-16 23:32:52 +0000 
UTC" firstStartedPulling="2026-04-16 23:32:53.002720449 +0000 UTC m=+433.604514637" lastFinishedPulling="2026-04-16 23:32:56.264598224 +0000 UTC m=+436.866392421" observedRunningTime="2026-04-16 23:32:57.223092262 +0000 UTC m=+437.824886472" watchObservedRunningTime="2026-04-16 23:32:57.224861832 +0000 UTC m=+437.826656039" Apr 16 23:32:58.164798 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:58.164765 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-gqxck" Apr 16 23:32:58.201754 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:32:58.201722 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-nrgs4" Apr 16 23:33:03.182668 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:33:03.182637 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-phsfh" Apr 16 23:33:11.146257 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:33:11.146226 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-rc7c4" Apr 16 23:34:02.717687 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.717652 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zgfbn"] Apr 16 23:34:02.720770 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.720755 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:02.723513 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.723495 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-d2b59\"" Apr 16 23:34:02.724661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.724643 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:34:02.724736 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.724688 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:34:02.729815 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.729788 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zgfbn"] Apr 16 23:34:02.783942 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.783910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s7bg\" (UniqueName: \"kubernetes.io/projected/1e2bd386-b139-4d6b-8974-2537b7b75d1f-kube-api-access-7s7bg\") pod \"authorino-operator-657f44b778-zgfbn\" (UID: \"1e2bd386-b139-4d6b-8974-2537b7b75d1f\") " pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:02.884342 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.884303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7s7bg\" (UniqueName: \"kubernetes.io/projected/1e2bd386-b139-4d6b-8974-2537b7b75d1f-kube-api-access-7s7bg\") pod \"authorino-operator-657f44b778-zgfbn\" (UID: \"1e2bd386-b139-4d6b-8974-2537b7b75d1f\") " pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:02.896587 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:02.896556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s7bg\" (UniqueName: 
\"kubernetes.io/projected/1e2bd386-b139-4d6b-8974-2537b7b75d1f-kube-api-access-7s7bg\") pod \"authorino-operator-657f44b778-zgfbn\" (UID: \"1e2bd386-b139-4d6b-8974-2537b7b75d1f\") " pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:03.032395 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:03.032301 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:03.158133 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:03.158103 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zgfbn"] Apr 16 23:34:03.161179 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:34:03.161152 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e2bd386_b139_4d6b_8974_2537b7b75d1f.slice/crio-f02b9970bfebba3ce4e84ae6d89137042f650d83ac932ae02511fa5a64461712 WatchSource:0}: Error finding container f02b9970bfebba3ce4e84ae6d89137042f650d83ac932ae02511fa5a64461712: Status 404 returned error can't find the container with id f02b9970bfebba3ce4e84ae6d89137042f650d83ac932ae02511fa5a64461712 Apr 16 23:34:03.402519 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:03.402485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" event={"ID":"1e2bd386-b139-4d6b-8974-2537b7b75d1f","Type":"ContainerStarted","Data":"f02b9970bfebba3ce4e84ae6d89137042f650d83ac932ae02511fa5a64461712"} Apr 16 23:34:05.409734 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:05.409683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" event={"ID":"1e2bd386-b139-4d6b-8974-2537b7b75d1f","Type":"ContainerStarted","Data":"0299ab67c2e365067951e3526289820548d3bfdb1a0b8e1196ab318db08df43f"} Apr 16 23:34:05.410155 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:05.409852 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:05.433425 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:05.433362 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" podStartSLOduration=1.750796391 podStartE2EDuration="3.433344646s" podCreationTimestamp="2026-04-16 23:34:02 +0000 UTC" firstStartedPulling="2026-04-16 23:34:03.163151446 +0000 UTC m=+503.764945630" lastFinishedPulling="2026-04-16 23:34:04.845699689 +0000 UTC m=+505.447493885" observedRunningTime="2026-04-16 23:34:05.431548143 +0000 UTC m=+506.033342351" watchObservedRunningTime="2026-04-16 23:34:05.433344646 +0000 UTC m=+506.035138854" Apr 16 23:34:16.415069 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:16.415041 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-zgfbn" Apr 16 23:34:29.125579 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.125485 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:34:29.128479 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.128461 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.130991 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.130954 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-p2glc\"" Apr 16 23:34:29.143419 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.143394 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:34:29.296098 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.296057 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.296273 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.296124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdpm\" (UniqueName: \"kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.397416 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.397331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.397416 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.397398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdpm\" (UniqueName: \"kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.397821 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.397802 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.406160 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.406139 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdpm\" (UniqueName: \"kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-fc28w\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.438994 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.438946 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:29.553693 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:29.553635 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:34:29.556634 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:34:29.556606 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod706ef90c_6ac7_428a_9ac3_152018a8e48e.slice/crio-7105c5e8fdc0bf6cd6d5e8e7e1c530e50d348c24232b464b68372cb4161e4722 WatchSource:0}: Error finding container 7105c5e8fdc0bf6cd6d5e8e7e1c530e50d348c24232b464b68372cb4161e4722: Status 404 returned error can't find the container with id 7105c5e8fdc0bf6cd6d5e8e7e1c530e50d348c24232b464b68372cb4161e4722 Apr 16 23:34:30.489719 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:30.489683 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" event={"ID":"706ef90c-6ac7-428a-9ac3-152018a8e48e","Type":"ContainerStarted","Data":"7105c5e8fdc0bf6cd6d5e8e7e1c530e50d348c24232b464b68372cb4161e4722"} Apr 16 23:34:34.505642 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:34.505607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" event={"ID":"706ef90c-6ac7-428a-9ac3-152018a8e48e","Type":"ContainerStarted","Data":"8249a5d30a164a0508400539724f1dd095dcb1311e19863600ae20673022922e"} Apr 16 23:34:34.506128 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:34.505662 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:34:34.525725 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:34.525678 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" podStartSLOduration=1.616282515 podStartE2EDuration="5.525663994s" podCreationTimestamp="2026-04-16 23:34:29 +0000 UTC" firstStartedPulling="2026-04-16 23:34:29.558834144 +0000 UTC m=+530.160628332" lastFinishedPulling="2026-04-16 23:34:33.468215613 +0000 UTC m=+534.070009811" observedRunningTime="2026-04-16 23:34:34.523336533 +0000 UTC m=+535.125130743" watchObservedRunningTime="2026-04-16 23:34:34.525663994 +0000 UTC m=+535.127458199" Apr 16 23:34:45.511312 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:34:45.511279 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:35:07.108631 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.108592 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:07.115352 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.115329 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:07.117986 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.117949 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lc5nq\"" Apr 16 23:35:07.118788 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.118764 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:07.176630 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.176594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r9qx\" (UniqueName: \"kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx\") pod \"authorino-7498df8756-qm79t\" (UID: \"5e9b4acb-52df-4e0c-96a6-4590b9f30628\") " pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:07.277146 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.277111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r9qx\" (UniqueName: \"kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx\") pod \"authorino-7498df8756-qm79t\" (UID: \"5e9b4acb-52df-4e0c-96a6-4590b9f30628\") " pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:07.290068 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.290037 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r9qx\" (UniqueName: \"kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx\") pod \"authorino-7498df8756-qm79t\" (UID: \"5e9b4acb-52df-4e0c-96a6-4590b9f30628\") " pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:07.425586 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.425482 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:07.545601 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.545571 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:07.548852 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:35:07.548823 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e9b4acb_52df_4e0c_96a6_4590b9f30628.slice/crio-6eab8f861bf0c5c904209cb2802f6b31d0ee1943ced6465a75dd8a6455336466 WatchSource:0}: Error finding container 6eab8f861bf0c5c904209cb2802f6b31d0ee1943ced6465a75dd8a6455336466: Status 404 returned error can't find the container with id 6eab8f861bf0c5c904209cb2802f6b31d0ee1943ced6465a75dd8a6455336466 Apr 16 23:35:07.612109 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:07.612076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qm79t" event={"ID":"5e9b4acb-52df-4e0c-96a6-4590b9f30628","Type":"ContainerStarted","Data":"6eab8f861bf0c5c904209cb2802f6b31d0ee1943ced6465a75dd8a6455336466"} Apr 16 23:35:11.634483 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:11.634446 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qm79t" event={"ID":"5e9b4acb-52df-4e0c-96a6-4590b9f30628","Type":"ContainerStarted","Data":"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af"} Apr 16 23:35:11.650942 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:11.650892 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-qm79t" podStartSLOduration=1.4740491740000001 podStartE2EDuration="4.650876554s" podCreationTimestamp="2026-04-16 23:35:07 +0000 UTC" firstStartedPulling="2026-04-16 23:35:07.550075474 +0000 UTC m=+568.151869661" lastFinishedPulling="2026-04-16 23:35:10.726902854 +0000 UTC m=+571.328697041" observedRunningTime="2026-04-16 23:35:11.649990415 +0000 UTC m=+572.251784623" watchObservedRunningTime="2026-04-16 23:35:11.650876554 +0000 UTC m=+572.252670760" Apr 16 23:35:41.119487 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.119449 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-54867cf69d-vhrps"] Apr 16 23:35:41.122697 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.122680 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.136577 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.136553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54867cf69d-vhrps"] Apr 16 23:35:41.253480 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.253446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpwcq\" (UniqueName: \"kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq\") pod \"authorino-54867cf69d-vhrps\" (UID: \"5a39fdee-c392-40dc-b358-e799203774e0\") " pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.302290 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.302251 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54867cf69d-vhrps"] Apr 16 23:35:41.302490 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:35:41.302470 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tpwcq], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-54867cf69d-vhrps" podUID="5a39fdee-c392-40dc-b358-e799203774e0" Apr 16 23:35:41.354143 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.354108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpwcq\" (UniqueName: \"kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq\") pod \"authorino-54867cf69d-vhrps\" (UID: \"5a39fdee-c392-40dc-b358-e799203774e0\") " pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.368573 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.368539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpwcq\" (UniqueName: \"kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq\") pod \"authorino-54867cf69d-vhrps\" (UID: \"5a39fdee-c392-40dc-b358-e799203774e0\") " pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.728071 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.728037 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.732995 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.732924 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:41.858581 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.858547 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpwcq\" (UniqueName: \"kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq\") pod \"5a39fdee-c392-40dc-b358-e799203774e0\" (UID: \"5a39fdee-c392-40dc-b358-e799203774e0\") " Apr 16 23:35:41.860696 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.860665 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq" (OuterVolumeSpecName: "kube-api-access-tpwcq") pod "5a39fdee-c392-40dc-b358-e799203774e0" (UID: "5a39fdee-c392-40dc-b358-e799203774e0"). InnerVolumeSpecName "kube-api-access-tpwcq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:35:41.959719 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:41.959683 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tpwcq\" (UniqueName: \"kubernetes.io/projected/5a39fdee-c392-40dc-b358-e799203774e0-kube-api-access-tpwcq\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:35:42.731424 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:42.731386 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54867cf69d-vhrps" Apr 16 23:35:42.775754 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:42.775718 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54867cf69d-vhrps"] Apr 16 23:35:42.779714 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:42.779685 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-54867cf69d-vhrps"] Apr 16 23:35:43.250441 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.250408 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:43.250613 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.250592 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-qm79t" podUID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" containerName="authorino" containerID="cri-o://6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af" gracePeriod=30 Apr 16 23:35:43.490873 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.490848 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:43.673261 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.673229 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r9qx\" (UniqueName: \"kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx\") pod \"5e9b4acb-52df-4e0c-96a6-4590b9f30628\" (UID: \"5e9b4acb-52df-4e0c-96a6-4590b9f30628\") " Apr 16 23:35:43.675406 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.675380 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx" (OuterVolumeSpecName: "kube-api-access-8r9qx") pod "5e9b4acb-52df-4e0c-96a6-4590b9f30628" (UID: "5e9b4acb-52df-4e0c-96a6-4590b9f30628"). InnerVolumeSpecName "kube-api-access-8r9qx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:35:43.736086 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.736047 2569 generic.go:358] "Generic (PLEG): container finished" podID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" containerID="6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af" exitCode=0 Apr 16 23:35:43.736470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.736104 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-qm79t" Apr 16 23:35:43.736470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.736127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qm79t" event={"ID":"5e9b4acb-52df-4e0c-96a6-4590b9f30628","Type":"ContainerDied","Data":"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af"} Apr 16 23:35:43.736470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.736162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-qm79t" event={"ID":"5e9b4acb-52df-4e0c-96a6-4590b9f30628","Type":"ContainerDied","Data":"6eab8f861bf0c5c904209cb2802f6b31d0ee1943ced6465a75dd8a6455336466"} Apr 16 23:35:43.736470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.736177 2569 scope.go:117] "RemoveContainer" containerID="6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af" Apr 16 23:35:43.744522 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.744507 2569 scope.go:117] "RemoveContainer" containerID="6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af" Apr 16 23:35:43.744844 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:35:43.744822 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af\": container with ID starting with 6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af not found: ID does not exist" containerID="6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af" Apr 16 23:35:43.744924 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.744853 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af"} err="failed to get container status \"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af\": rpc error: code = NotFound desc = could not find container \"6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af\": container with ID starting with 6873ceeec961f041cb3ca7453c2aac7bd2371b0f634e01d2347916e6841ea7af not found: ID does not exist" Apr 16 23:35:43.755811 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.755786 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:43.759659 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.759640 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-qm79t"] Apr 16 23:35:43.774283 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.774261 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8r9qx\" (UniqueName: \"kubernetes.io/projected/5e9b4acb-52df-4e0c-96a6-4590b9f30628-kube-api-access-8r9qx\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:35:43.980749 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.980667 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a39fdee-c392-40dc-b358-e799203774e0" path="/var/lib/kubelet/pods/5a39fdee-c392-40dc-b358-e799203774e0/volumes" Apr 16 23:35:43.980996 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:35:43.980952 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" path="/var/lib/kubelet/pods/5e9b4acb-52df-4e0c-96a6-4590b9f30628/volumes" Apr 16 23:36:44.672826 ip-10-0-136-147 kubenswrapper[2569]: I0416 
23:36:44.672787 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2"] Apr 16 23:36:44.673243 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.673126 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" containerName="authorino" Apr 16 23:36:44.673243 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.673139 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" containerName="authorino" Apr 16 23:36:44.673243 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.673197 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e9b4acb-52df-4e0c-96a6-4590b9f30628" containerName="authorino" Apr 16 23:36:44.676444 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.676423 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.678896 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.678876 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 23:36:44.680201 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.680181 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 23:36:44.680290 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.680191 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 23:36:44.680290 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.680267 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-k4454\"" Apr 16 23:36:44.683935 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.683916 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2"] Apr 16 23:36:44.871448 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.871448 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871450 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.871642 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.871642 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871565 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.871642 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.871754 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.871655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9m2\" (UniqueName: \"kubernetes.io/projected/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kube-api-access-bz9m2\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972277 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972277 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9m2\" (UniqueName: \"kubernetes.io/projected/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kube-api-access-bz9m2\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972320 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972470 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972357 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-model-cache\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972592 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972471 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972661 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972805 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.972934 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.972915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.974753 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.974733 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.975235 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.975215 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.979860 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.979841 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9m2\" (UniqueName: \"kubernetes.io/projected/8fcfc5f2-7b5a-44af-822c-6a062ec4c701-kube-api-access-bz9m2\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2\" (UID: \"8fcfc5f2-7b5a-44af-822c-6a062ec4c701\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:44.985762 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:44.985742 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:36:45.111101 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:45.111075 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2"] Apr 16 23:36:45.113721 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:36:45.113686 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fcfc5f2_7b5a_44af_822c_6a062ec4c701.slice/crio-4e817d03a0cda16538020652995838914a5899ee720551da67ca3352c53e4ef8 WatchSource:0}: Error finding container 4e817d03a0cda16538020652995838914a5899ee720551da67ca3352c53e4ef8: Status 404 returned error can't find the container with id 4e817d03a0cda16538020652995838914a5899ee720551da67ca3352c53e4ef8 Apr 16 23:36:45.949910 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:45.949875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" event={"ID":"8fcfc5f2-7b5a-44af-822c-6a062ec4c701","Type":"ContainerStarted","Data":"4e817d03a0cda16538020652995838914a5899ee720551da67ca3352c53e4ef8"} Apr 16 23:36:51.972015 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:36:51.971897 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" event={"ID":"8fcfc5f2-7b5a-44af-822c-6a062ec4c701","Type":"ContainerStarted","Data":"a98e84ecca4cd8da8128eea6e8273e6c8c74c88f37be8bdc4f0ed881b31b613b"} Apr 16 23:37:00.001584 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:00.001551 2569 generic.go:358] "Generic (PLEG): container finished" podID="8fcfc5f2-7b5a-44af-822c-6a062ec4c701" containerID="a98e84ecca4cd8da8128eea6e8273e6c8c74c88f37be8bdc4f0ed881b31b613b" exitCode=0 Apr 16 23:37:00.002062 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:00.001631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" event={"ID":"8fcfc5f2-7b5a-44af-822c-6a062ec4c701","Type":"ContainerDied","Data":"a98e84ecca4cd8da8128eea6e8273e6c8c74c88f37be8bdc4f0ed881b31b613b"} Apr 16 23:37:00.002337 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:00.002319 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:37:04.019275 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:04.019235 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" event={"ID":"8fcfc5f2-7b5a-44af-822c-6a062ec4c701","Type":"ContainerStarted","Data":"118cf5df0469b3b3f1b0ea4d23fc19eb9c029aafb965fe141bc390d11072c26d"} Apr 16 23:37:04.019732 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:04.019466 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:37:05.270742 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.270688 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" podStartSLOduration=2.556654228 podStartE2EDuration="21.270672076s" podCreationTimestamp="2026-04-16 23:36:44 +0000 UTC" firstStartedPulling="2026-04-16 23:36:45.115393124 +0000 UTC m=+665.717187310" lastFinishedPulling="2026-04-16 23:37:03.829410957 +0000 UTC m=+684.431205158" observedRunningTime="2026-04-16 23:37:04.037763196 +0000 UTC 
m=+684.639557403" watchObservedRunningTime="2026-04-16 23:37:05.270672076 +0000 UTC m=+685.872466281" Apr 16 23:37:05.272232 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.272209 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf"] Apr 16 23:37:05.275837 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.275819 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.278321 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.278303 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 23:37:05.285061 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.285037 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf"] Apr 16 23:37:05.355165 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355134 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxnm4\" (UniqueName: \"kubernetes.io/projected/a742dba7-d0d1-4911-9da8-19011bc7964b-kube-api-access-cxnm4\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.355320 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.355320 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355255 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.355320 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.355427 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a742dba7-d0d1-4911-9da8-19011bc7964b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.355427 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.355372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-home\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456308 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456275 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a742dba7-d0d1-4911-9da8-19011bc7964b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456489 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456327 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456489 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxnm4\" (UniqueName: \"kubernetes.io/projected/a742dba7-d0d1-4911-9da8-19011bc7964b-kube-api-access-cxnm4\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456489 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456423 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456489 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456489 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.456923 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.456899 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.457208 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.457181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" 
(UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.457283 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.457223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.459019 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.458994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a742dba7-d0d1-4911-9da8-19011bc7964b-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.459382 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.459357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a742dba7-d0d1-4911-9da8-19011bc7964b-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.464262 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.464242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxnm4\" (UniqueName: \"kubernetes.io/projected/a742dba7-d0d1-4911-9da8-19011bc7964b-kube-api-access-cxnm4\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf\" (UID: \"a742dba7-d0d1-4911-9da8-19011bc7964b\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.587048 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.586951 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:05.713714 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:05.713686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf"] Apr 16 23:37:05.715389 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:37:05.715363 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda742dba7_d0d1_4911_9da8_19011bc7964b.slice/crio-f86cea30944b598e5094e42f5be8219ded4c5ec06aae44b57321ed2533fed4d6 WatchSource:0}: Error finding container f86cea30944b598e5094e42f5be8219ded4c5ec06aae44b57321ed2533fed4d6: Status 404 returned error can't find the container with id f86cea30944b598e5094e42f5be8219ded4c5ec06aae44b57321ed2533fed4d6 Apr 16 23:37:06.027329 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:06.027289 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" event={"ID":"a742dba7-d0d1-4911-9da8-19011bc7964b","Type":"ContainerStarted","Data":"92a305282a39c56893799f83a4db9a6a4ef37749b7d6a5fd44cbde55aa655fb3"} Apr 16 23:37:06.027329 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:06.027333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" event={"ID":"a742dba7-d0d1-4911-9da8-19011bc7964b","Type":"ContainerStarted","Data":"f86cea30944b598e5094e42f5be8219ded4c5ec06aae44b57321ed2533fed4d6"} Apr 16 23:37:12.051447 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:12.051411 2569 generic.go:358] "Generic (PLEG): container finished" podID="a742dba7-d0d1-4911-9da8-19011bc7964b" containerID="92a305282a39c56893799f83a4db9a6a4ef37749b7d6a5fd44cbde55aa655fb3" exitCode=0 Apr 16 23:37:12.051806 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:12.051462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" event={"ID":"a742dba7-d0d1-4911-9da8-19011bc7964b","Type":"ContainerDied","Data":"92a305282a39c56893799f83a4db9a6a4ef37749b7d6a5fd44cbde55aa655fb3"} Apr 16 23:37:13.056625 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:13.056588 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" event={"ID":"a742dba7-d0d1-4911-9da8-19011bc7964b","Type":"ContainerStarted","Data":"5d66144f011b82ed0893b3746ccb3ccd96628f18cb2cbaf99a3cd1a7f1b55e78"} Apr 16 23:37:13.057223 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:13.056833 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:13.074092 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:13.074041 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" podStartSLOduration=7.742720777 podStartE2EDuration="8.074027769s" podCreationTimestamp="2026-04-16 23:37:05 +0000 UTC" firstStartedPulling="2026-04-16 23:37:12.052083343 +0000 UTC m=+692.653877527" lastFinishedPulling="2026-04-16 23:37:12.38339033 +0000 UTC m=+692.985184519" observedRunningTime="2026-04-16 23:37:13.072886501 +0000 UTC m=+693.674680707" watchObservedRunningTime="2026-04-16 23:37:13.074027769 +0000 UTC m=+693.675821975" Apr 16 23:37:15.039690 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:15.039656 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2" Apr 16 23:37:24.073256 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:24.073172 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf" Apr 16 23:37:40.675694 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.675659 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2"] Apr 16 23:37:40.679377 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.679354 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.681846 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.681826 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 23:37:40.688581 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.688561 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2"] Apr 16 23:37:40.761187 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.761351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761214 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.761351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.761351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761268 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.761351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761322 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.761511 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.761359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k79t8\" (UniqueName: \"kubernetes.io/projected/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kube-api-access-k79t8\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862371 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862371 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862369 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862588 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862389 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862588 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862444 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862588 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862479 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k79t8\" (UniqueName: \"kubernetes.io/projected/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kube-api-access-k79t8\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862588 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862859 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862837 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862925 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862878 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.862993 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.862925 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.864589 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.864569 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.864848 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.864831 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.869739 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.869714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k79t8\" (UniqueName: \"kubernetes.io/projected/c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52-kube-api-access-k79t8\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-d46r2\" (UID: \"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:40.990226 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:40.990147 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:41.117531 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:41.117507 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2"] Apr 16 23:37:41.119740 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:37:41.119710 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8c3c5f4_a48d_4f70_9865_7c5e1bf9ef52.slice/crio-794dfbf3bc41d02add09086c7780772eb4760109dd6effff50a8bcd74ba40cda WatchSource:0}: Error finding container 794dfbf3bc41d02add09086c7780772eb4760109dd6effff50a8bcd74ba40cda: Status 404 returned error can't find the container with id 794dfbf3bc41d02add09086c7780772eb4760109dd6effff50a8bcd74ba40cda Apr 16 23:37:41.150428 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:41.150395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" event={"ID":"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52","Type":"ContainerStarted","Data":"794dfbf3bc41d02add09086c7780772eb4760109dd6effff50a8bcd74ba40cda"} Apr 16 23:37:42.154864 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:42.154828 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" event={"ID":"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52","Type":"ContainerStarted","Data":"22481e1637c2ec2c1a3aa1db855ef500b6d55bfb5ea57c614cf70762acc1ebf7"} Apr 16 23:37:47.172726 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:47.172690 2569 generic.go:358] "Generic (PLEG): container finished" podID="c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52" containerID="22481e1637c2ec2c1a3aa1db855ef500b6d55bfb5ea57c614cf70762acc1ebf7" exitCode=0 Apr 16 23:37:47.172726 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:47.172730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" event={"ID":"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52","Type":"ContainerDied","Data":"22481e1637c2ec2c1a3aa1db855ef500b6d55bfb5ea57c614cf70762acc1ebf7"} Apr 16 23:37:48.178173 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:48.178130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" event={"ID":"c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52","Type":"ContainerStarted","Data":"5d76ead56573b77cc6e26cc6409d255229ea2a4e7d19801d180c66d75e0ad453"} Apr 16 23:37:48.178521 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:48.178355 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:37:48.195595 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:48.195544 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" podStartSLOduration=8.039244236 podStartE2EDuration="8.195528679s" podCreationTimestamp="2026-04-16 23:37:40 +0000 UTC" firstStartedPulling="2026-04-16 23:37:47.173302744 +0000 UTC m=+727.775096929" lastFinishedPulling="2026-04-16 23:37:47.329587185 +0000 UTC m=+727.931381372" observedRunningTime="2026-04-16 23:37:48.193938281 +0000 UTC m=+728.795732488" watchObservedRunningTime="2026-04-16 23:37:48.195528679 +0000 UTC m=+728.797322885" Apr 16 23:37:59.194572 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:37:59.194535 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-d46r2" Apr 16 23:45:00.139575 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.139492 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:45:00.142804 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.142781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:45:00.145350 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.145331 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-l2dc4\"" Apr 16 23:45:00.156072 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.156048 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:45:00.295180 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.295144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvxs\" (UniqueName: \"kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs\") pod \"maas-api-key-cleanup-29606385-gthck\" (UID: \"3752f537-a3ef-4a14-9943-59faa1816a8c\") " pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:45:00.395979 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.395864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvxs\" (UniqueName: \"kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs\") pod \"maas-api-key-cleanup-29606385-gthck\" (UID: \"3752f537-a3ef-4a14-9943-59faa1816a8c\") " pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:45:00.404189 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.404158 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvxs\" (UniqueName: \"kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs\") pod \"maas-api-key-cleanup-29606385-gthck\" (UID: \"3752f537-a3ef-4a14-9943-59faa1816a8c\") " pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:45:00.452749 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.452723 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:45:00.574507 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.574457 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:45:00.576785 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:45:00.576760 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3752f537_a3ef_4a14_9943_59faa1816a8c.slice/crio-cf3bbbfc083013ce780608b27c0f31e92bd6f745b75458af871f7929300e72e6 WatchSource:0}: Error finding container cf3bbbfc083013ce780608b27c0f31e92bd6f745b75458af871f7929300e72e6: Status 404 returned error can't find the container with id cf3bbbfc083013ce780608b27c0f31e92bd6f745b75458af871f7929300e72e6 Apr 16 23:45:00.578955 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.578936 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:45:00.598291 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:00.598262 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerStarted","Data":"cf3bbbfc083013ce780608b27c0f31e92bd6f745b75458af871f7929300e72e6"} Apr 16 23:45:03.609075 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:03.609028 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerStarted","Data":"2a7dbfe6c0e7d18983f07804a10f537ab8eb819ac4345f750b99a02e03330918"} Apr 16 23:45:03.624043 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:03.623989 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" podStartSLOduration=1.8545127479999999 podStartE2EDuration="3.623947355s" podCreationTimestamp="2026-04-16 23:45:00 +0000 UTC" firstStartedPulling="2026-04-16 23:45:00.579115079 +0000 UTC m=+1161.180909266" lastFinishedPulling="2026-04-16 23:45:02.348549674 +0000 UTC m=+1162.950343873" observedRunningTime="2026-04-16 23:45:03.622692215 +0000 UTC m=+1164.224486421" watchObservedRunningTime="2026-04-16 23:45:03.623947355 +0000 UTC m=+1164.225741562" Apr 16 23:45:23.677140 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:23.677103 2569 generic.go:358] "Generic (PLEG): container finished" podID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerID="2a7dbfe6c0e7d18983f07804a10f537ab8eb819ac4345f750b99a02e03330918" exitCode=6 Apr 16 23:45:23.677140 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:23.677144 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerDied","Data":"2a7dbfe6c0e7d18983f07804a10f537ab8eb819ac4345f750b99a02e03330918"} Apr 16 23:45:23.677533 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:23.677439 2569 scope.go:117] "RemoveContainer" containerID="2a7dbfe6c0e7d18983f07804a10f537ab8eb819ac4345f750b99a02e03330918" Apr 16 23:45:24.681638 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:24.681603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerStarted","Data":"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227"} Apr 16 23:45:44.756243 ip-10-0-136-147 
kubenswrapper[2569]: I0416 23:45:44.756206 2569 generic.go:358] "Generic (PLEG): container finished" podID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" exitCode=6 Apr 16 23:45:44.756745 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:44.756279 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerDied","Data":"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227"} Apr 16 23:45:44.756745 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:44.756325 2569 scope.go:117] "RemoveContainer" containerID="2a7dbfe6c0e7d18983f07804a10f537ab8eb819ac4345f750b99a02e03330918" Apr 16 23:45:44.756745 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:44.756674 2569 scope.go:117] "RemoveContainer" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" Apr 16 23:45:44.756951 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:45:44.756933 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606385-gthck_opendatahub(3752f537-a3ef-4a14-9943-59faa1816a8c)\"" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" Apr 16 23:45:57.977218 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:57.977184 2569 scope.go:117] "RemoveContainer" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" Apr 16 23:45:58.807916 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:58.807881 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerStarted","Data":"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844"} Apr 16 23:45:59.004046 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:59.004013 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:45:59.810721 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:45:59.810684 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" containerID="cri-o://0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844" gracePeriod=30 Apr 16 23:46:18.854003 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.853981 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:46:18.877757 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.877725 2569 generic.go:358] "Generic (PLEG): container finished" podID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerID="0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844" exitCode=6 Apr 16 23:46:18.877913 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.877798 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" Apr 16 23:46:18.877913 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.877793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerDied","Data":"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844"} Apr 16 23:46:18.877913 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.877905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606385-gthck" event={"ID":"3752f537-a3ef-4a14-9943-59faa1816a8c","Type":"ContainerDied","Data":"cf3bbbfc083013ce780608b27c0f31e92bd6f745b75458af871f7929300e72e6"} Apr 16 23:46:18.878089 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.877933 2569 scope.go:117] "RemoveContainer" containerID="0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844" Apr 16 23:46:18.887238 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.887218 2569 scope.go:117] "RemoveContainer" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" Apr 16 23:46:18.894602 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.894579 2569 scope.go:117] "RemoveContainer" containerID="0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844" Apr 16 23:46:18.894848 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:46:18.894831 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844\": container with ID starting with 0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844 not found: ID does not exist" containerID="0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844" Apr 16 23:46:18.894908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.894858 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844"} err="failed to get container status \"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844\": rpc error: code = NotFound desc = could not find container \"0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844\": container with ID starting with 0c460b962863b67b478a973be9651ac06842d03cf881aeb3436f960e1f9e1844 not found: ID does not exist" Apr 16 23:46:18.894908 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.894876 2569 scope.go:117] "RemoveContainer" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" Apr 16 23:46:18.895120 ip-10-0-136-147 kubenswrapper[2569]: E0416 23:46:18.895101 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227\": container with ID starting with de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227 not found: ID does not exist" containerID="de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227" Apr 16 23:46:18.895178 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.895134 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227"} err="failed to get container status \"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227\": rpc error: code = NotFound desc = could not find container 
\"de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227\": container with ID starting with de594f38148a0a35139136bbdc13c4e7a65d01e090fee1678903cdecc9d69227 not found: ID does not exist" Apr 16 23:46:18.957580 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.957518 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfvxs\" (UniqueName: \"kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs\") pod \"3752f537-a3ef-4a14-9943-59faa1816a8c\" (UID: \"3752f537-a3ef-4a14-9943-59faa1816a8c\") " Apr 16 23:46:18.959502 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:18.959473 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs" (OuterVolumeSpecName: "kube-api-access-tfvxs") pod "3752f537-a3ef-4a14-9943-59faa1816a8c" (UID: "3752f537-a3ef-4a14-9943-59faa1816a8c"). InnerVolumeSpecName "kube-api-access-tfvxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:46:19.058137 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:19.058102 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfvxs\" (UniqueName: \"kubernetes.io/projected/3752f537-a3ef-4a14-9943-59faa1816a8c-kube-api-access-tfvxs\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:46:19.199351 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:19.199324 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:46:19.208647 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:19.206071 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606385-gthck"] Apr 16 23:46:19.980391 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:46:19.980357 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" path="/var/lib/kubelet/pods/3752f537-a3ef-4a14-9943-59faa1816a8c/volumes" Apr 16 23:50:03.413560 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.413481 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:50:03.414067 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.413753 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" podUID="706ef90c-6ac7-428a-9ac3-152018a8e48e" containerName="manager" containerID="cri-o://8249a5d30a164a0508400539724f1dd095dcb1311e19863600ae20673022922e" gracePeriod=10 Apr 16 23:50:03.616723 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.616689 2569 generic.go:358] "Generic (PLEG): container finished" podID="706ef90c-6ac7-428a-9ac3-152018a8e48e" containerID="8249a5d30a164a0508400539724f1dd095dcb1311e19863600ae20673022922e" exitCode=0 Apr 16 23:50:03.616899 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.616756 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" event={"ID":"706ef90c-6ac7-428a-9ac3-152018a8e48e","Type":"ContainerDied","Data":"8249a5d30a164a0508400539724f1dd095dcb1311e19863600ae20673022922e"} Apr 16 23:50:03.664696 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.664641 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:50:03.838614 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.838575 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume\") pod \"706ef90c-6ac7-428a-9ac3-152018a8e48e\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " Apr 16 23:50:03.838778 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.838636 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sdpm\" (UniqueName: \"kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm\") pod \"706ef90c-6ac7-428a-9ac3-152018a8e48e\" (UID: \"706ef90c-6ac7-428a-9ac3-152018a8e48e\") " Apr 16 23:50:03.838977 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.838937 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "706ef90c-6ac7-428a-9ac3-152018a8e48e" (UID: "706ef90c-6ac7-428a-9ac3-152018a8e48e"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:50:03.840704 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.840683 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm" (OuterVolumeSpecName: "kube-api-access-2sdpm") pod "706ef90c-6ac7-428a-9ac3-152018a8e48e" (UID: "706ef90c-6ac7-428a-9ac3-152018a8e48e"). InnerVolumeSpecName "kube-api-access-2sdpm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:50:03.939723 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.939648 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/706ef90c-6ac7-428a-9ac3-152018a8e48e-extensions-socket-volume\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:50:03.939723 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:03.939675 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sdpm\" (UniqueName: \"kubernetes.io/projected/706ef90c-6ac7-428a-9ac3-152018a8e48e-kube-api-access-2sdpm\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 16 23:50:04.621278 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:04.621250 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" Apr 16 23:50:04.621684 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:04.621249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w" event={"ID":"706ef90c-6ac7-428a-9ac3-152018a8e48e","Type":"ContainerDied","Data":"7105c5e8fdc0bf6cd6d5e8e7e1c530e50d348c24232b464b68372cb4161e4722"} Apr 16 23:50:04.621684 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:04.621363 2569 scope.go:117] "RemoveContainer" containerID="8249a5d30a164a0508400539724f1dd095dcb1311e19863600ae20673022922e" Apr 16 23:50:04.637699 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:04.637671 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:50:04.641110 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:04.641085 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-fc28w"] Apr 16 23:50:05.980411 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:50:05.980381 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706ef90c-6ac7-428a-9ac3-152018a8e48e" path="/var/lib/kubelet/pods/706ef90c-6ac7-428a-9ac3-152018a8e48e/volumes" Apr 16 23:51:09.491680 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491641 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"] Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491936 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="706ef90c-6ac7-428a-9ac3-152018a8e48e" containerName="manager" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491946 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="706ef90c-6ac7-428a-9ac3-152018a8e48e" containerName="manager" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491974 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491983 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.491995 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492000 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492010 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492015 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492063 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup" Apr 16 23:51:09.492154 ip-10-0-136-147 
Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492072 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="706ef90c-6ac7-428a-9ac3-152018a8e48e" containerName="manager"
Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492079 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup"
Apr 16 23:51:09.492154 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.492084 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3752f537-a3ef-4a14-9943-59faa1816a8c" containerName="cleanup"
Apr 16 23:51:09.494820 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.494803 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.497704 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.497684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-p2glc\""
Apr 16 23:51:09.506365 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.506344 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"]
Apr 16 23:51:09.686298 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.686256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rt4b\" (UniqueName: \"kubernetes.io/projected/28549e8a-58ab-442f-a2eb-75c1bc0915e7-kube-api-access-6rt4b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.686463 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.686374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28549e8a-58ab-442f-a2eb-75c1bc0915e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.787792 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.787710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28549e8a-58ab-442f-a2eb-75c1bc0915e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.787792 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.787767 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rt4b\" (UniqueName: \"kubernetes.io/projected/28549e8a-58ab-442f-a2eb-75c1bc0915e7-kube-api-access-6rt4b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.788128 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.788108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28549e8a-58ab-442f-a2eb-75c1bc0915e7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.799814 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.799785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rt4b\" (UniqueName: \"kubernetes.io/projected/28549e8a-58ab-442f-a2eb-75c1bc0915e7-kube-api-access-6rt4b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq\" (UID: \"28549e8a-58ab-442f-a2eb-75c1bc0915e7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.804635 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.804613 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:09.926040 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.926017 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"]
Apr 16 23:51:09.929067 ip-10-0-136-147 kubenswrapper[2569]: W0416 23:51:09.929041 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28549e8a_58ab_442f_a2eb_75c1bc0915e7.slice/crio-3ef7a2b24e5134524f22e1f3f88f9a2a318ffbc93b126452802925622cca4b07 WatchSource:0}: Error finding container 3ef7a2b24e5134524f22e1f3f88f9a2a318ffbc93b126452802925622cca4b07: Status 404 returned error can't find the container with id 3ef7a2b24e5134524f22e1f3f88f9a2a318ffbc93b126452802925622cca4b07
Apr 16 23:51:09.931879 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:09.931865 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:51:10.828475 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:10.828436 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq" event={"ID":"28549e8a-58ab-442f-a2eb-75c1bc0915e7","Type":"ContainerStarted","Data":"5764b9ffb9de13c7ac7057e24b29cc5294a7340baf0dd9295390ff8da44a7422"}
Apr 16 23:51:10.828475 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:10.828471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq" event={"ID":"28549e8a-58ab-442f-a2eb-75c1bc0915e7","Type":"ContainerStarted","Data":"3ef7a2b24e5134524f22e1f3f88f9a2a318ffbc93b126452802925622cca4b07"}
Apr 16 23:51:10.828900 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:10.828558 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
Apr 16 23:51:10.850309 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:10.850262 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq" podStartSLOduration=1.850245372 podStartE2EDuration="1.850245372s" podCreationTimestamp="2026-04-16 23:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:51:10.848287817 +0000 UTC m=+1531.450082031" watchObservedRunningTime="2026-04-16 23:51:10.850245372 +0000 UTC m=+1531.452039577"
Apr 16 23:51:21.836067 ip-10-0-136-147 kubenswrapper[2569]: I0416 23:51:21.836033 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq"
probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq" Apr 17 00:00:00.128921 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.128843 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:00:00.132462 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.132436 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:00:00.134837 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.134817 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-l2dc4\"" Apr 17 00:00:00.148051 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.148024 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:00:00.219028 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.218986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-582sh\" (UniqueName: \"kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh\") pod \"maas-api-key-cleanup-29606400-hdwlc\" (UID: \"a18c8650-8e45-4963-8fa9-aa8a5988d2c0\") " pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:00:00.320158 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.320127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-582sh\" (UniqueName: \"kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh\") pod \"maas-api-key-cleanup-29606400-hdwlc\" (UID: \"a18c8650-8e45-4963-8fa9-aa8a5988d2c0\") " pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:00:00.328751 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.328722 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-582sh\" (UniqueName: \"kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh\") pod \"maas-api-key-cleanup-29606400-hdwlc\" (UID: \"a18c8650-8e45-4963-8fa9-aa8a5988d2c0\") " pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:00:00.457382 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.457296 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:00:00.578728 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.578696 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:00:00.581410 ip-10-0-136-147 kubenswrapper[2569]: W0417 00:00:00.581384 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda18c8650_8e45_4963_8fa9_aa8a5988d2c0.slice/crio-bccf3ed28be8547655d186edd982b9c90f9c5b3f2df5f7d37b56d6ad6bb0d663 WatchSource:0}: Error finding container bccf3ed28be8547655d186edd982b9c90f9c5b3f2df5f7d37b56d6ad6bb0d663: Status 404 returned error can't find the container with id bccf3ed28be8547655d186edd982b9c90f9c5b3f2df5f7d37b56d6ad6bb0d663 Apr 17 00:00:00.583523 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:00.583506 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:00:01.541071 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:01.541033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerStarted","Data":"55d4f1faa7bc2e0383ad4fe80e06f6a1be46be02aa2d16ecb8acb4a6d63a6222"} Apr 17 00:00:01.541071 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:01.541076 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerStarted","Data":"bccf3ed28be8547655d186edd982b9c90f9c5b3f2df5f7d37b56d6ad6bb0d663"} Apr 17 00:00:01.567655 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:01.567598 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" podStartSLOduration=1.5675791700000001 podStartE2EDuration="1.56757917s" podCreationTimestamp="2026-04-17 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:00:01.566539227 +0000 UTC m=+2062.168333432" watchObservedRunningTime="2026-04-17 00:00:01.56757917 +0000 UTC m=+2062.169373377" Apr 17 00:00:21.612270 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:21.612235 2569 generic.go:358] "Generic (PLEG): container finished" podID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" containerID="55d4f1faa7bc2e0383ad4fe80e06f6a1be46be02aa2d16ecb8acb4a6d63a6222" exitCode=6 Apr 17 00:00:21.612673 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:21.612309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerDied","Data":"55d4f1faa7bc2e0383ad4fe80e06f6a1be46be02aa2d16ecb8acb4a6d63a6222"} Apr 17 00:00:21.612673 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:21.612646 2569 scope.go:117] "RemoveContainer" containerID="55d4f1faa7bc2e0383ad4fe80e06f6a1be46be02aa2d16ecb8acb4a6d63a6222" Apr 17 00:00:22.616559 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:22.616523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerStarted","Data":"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b"} Apr 17 00:00:42.700208 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:42.700110 2569 
generic.go:358] "Generic (PLEG): container finished" podID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" exitCode=6 Apr 17 00:00:42.700208 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:42.700163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerDied","Data":"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b"} Apr 17 00:00:42.700208 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:42.700199 2569 scope.go:117] "RemoveContainer" containerID="55d4f1faa7bc2e0383ad4fe80e06f6a1be46be02aa2d16ecb8acb4a6d63a6222" Apr 17 00:00:42.700719 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:42.700513 2569 scope.go:117] "RemoveContainer" containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" Apr 17 00:00:42.700764 ip-10-0-136-147 kubenswrapper[2569]: E0417 00:00:42.700721 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606400-hdwlc_opendatahub(a18c8650-8e45-4963-8fa9-aa8a5988d2c0)\"" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" podUID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" Apr 17 00:00:47.364980 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:47.364917 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rc7c4_72a63abb-7588-45a5-8a4a-bb33634f216a/manager/0.log" Apr 17 00:00:47.596822 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:47.596790 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29606400-hdwlc_a18c8650-8e45-4963-8fa9-aa8a5988d2c0/cleanup/1.log" Apr 17 00:00:47.818106 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:47.818025 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-phsfh_7e21b3a7-9358-46a5-bb91-9f52f04115e3/manager/2.log" Apr 17 00:00:48.032278 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:48.032242 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-sj6lz_bbc0caec-b3c4-437a-a8cd-00e91951b67f/manager/0.log" Apr 17 00:00:49.579973 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:49.579928 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zgfbn_1e2bd386-b139-4d6b-8974-2537b7b75d1f/manager/0.log" Apr 17 00:00:50.006653 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:50.006624 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq_28549e8a-58ab-442f-a2eb-75c1bc0915e7/manager/0.log" Apr 17 00:00:50.660245 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:50.660211 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-nrgs4_9ac6d6e8-434f-43b8-9161-ebc9bbd75e46/discovery/0.log" Apr 17 00:00:50.760281 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:50.760247 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-577868f455-6t5sg_995cc232-3ec9-4d93-9d5c-b77cbca875ec/kube-auth-proxy/0.log" Apr 17 00:00:51.424032 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.424002 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf_a742dba7-d0d1-4911-9da8-19011bc7964b/storage-initializer/0.log" Apr 17 00:00:51.445381 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.445359 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-4ztwf_a742dba7-d0d1-4911-9da8-19011bc7964b/main/0.log" Apr 17 00:00:51.660176 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.660148 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-d46r2_c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52/storage-initializer/0.log" Apr 17 00:00:51.666446 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.666424 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-d46r2_c8c3c5f4-a48d-4f70-9865-7c5e1bf9ef52/main/0.log" Apr 17 00:00:51.990049 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.990019 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2_8fcfc5f2-7b5a-44af-822c-6a062ec4c701/storage-initializer/0.log" Apr 17 00:00:51.998286 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:51.998266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-vcxd2_8fcfc5f2-7b5a-44af-822c-6a062ec4c701/main/0.log" Apr 17 00:00:57.976836 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:57.976801 2569 scope.go:117] "RemoveContainer" containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" Apr 17 00:00:58.756407 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:58.756368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerStarted","Data":"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee"} Apr 17 00:00:59.002555 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:59.002520 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:00:59.482060 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:59.482036 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lr65x_11bac137-baa0-441f-9af0-85cedda59681/global-pull-secret-syncer/0.log" Apr 17 00:00:59.588992 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:59.588945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c8wlz_f7cbe5f0-8f20-44e0-b2d5-f50ce28222f0/konnectivity-agent/0.log" Apr 17 00:00:59.699848 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:59.699817 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-147.ec2.internal_a191383db2d48d2a4a230e7ba22ce803/haproxy/0.log" Apr 17 00:00:59.759830 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:00:59.759746 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" podUID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" containerName="cleanup" containerID="cri-o://5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee" gracePeriod=30 Apr 17 00:01:04.310038 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:04.310003 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zgfbn_1e2bd386-b139-4d6b-8974-2537b7b75d1f/manager/0.log" Apr 
17 00:01:04.450068 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:04.450031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-xd7dq_28549e8a-58ab-442f-a2eb-75c1bc0915e7/manager/0.log" Apr 17 00:01:06.134073 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:06.134045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2h6sb_f87142d4-7511-4ad7-8599-27eb607fb332/node-exporter/0.log" Apr 17 00:01:06.154103 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:06.154080 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2h6sb_f87142d4-7511-4ad7-8599-27eb607fb332/kube-rbac-proxy/0.log" Apr 17 00:01:06.174893 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:06.174874 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2h6sb_f87142d4-7511-4ad7-8599-27eb607fb332/init-textfile/0.log" Apr 17 00:01:07.819982 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:07.819938 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4c6rh_a35ffddb-daf5-4419-8950-a0f8be5ddeba/networking-console-plugin/0.log" Apr 17 00:01:08.893485 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.893455 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj"] Apr 17 00:01:08.896500 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.896484 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:08.898930 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.898906 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"openshift-service-ca.crt\"" Apr 17 00:01:08.900049 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.900026 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zs59b\"/\"default-dockercfg-rtvn7\"" Apr 17 00:01:08.900149 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.900090 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"kube-root-ca.crt\"" Apr 17 00:01:08.902769 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:08.902734 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj"] Apr 17 00:01:09.003383 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.003348 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-proc\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.003383 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.003392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-sys\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.003593 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.003457 2569 
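The "Finished parsing log file" paths all follow the standard kubelet layout /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart>.log, so namespace, pod, container, and restart count can be recovered from the path alone; the restart index 1 on the cleanup log above reflects the crash-looping container. A small decoder, using a sample path taken from these entries:

from pathlib import PurePosixPath

def parse_pod_log_path(path):
    """Decode /var/log/pods/<namespace>_<pod>_<uid>/<container>/<restart>.log."""
    parts = PurePosixPath(path).parts  # ('/', 'var', 'log', 'pods', ns_pod_uid, container, file)
    namespace, pod, uid = parts[4].split("_", 2)  # pod names cannot contain "_"
    return {"namespace": namespace, "pod": pod, "uid": uid,
            "container": parts[5], "restart": int(PurePosixPath(parts[6]).stem)}

print(parse_pod_log_path(
    "/var/log/pods/opendatahub_maas-api-key-cleanup-29606400-hdwlc_"
    "a18c8650-8e45-4963-8fa9-aa8a5988d2c0/cleanup/1.log"))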
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-lib-modules\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.003593 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.003505 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkkr\" (UniqueName: \"kubernetes.io/projected/1f77190e-5983-4308-84c7-bfefbd17d581-kube-api-access-9vkkr\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.003593 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.003539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-podres\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104687 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104621 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-proc\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104687 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-sys\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-lib-modules\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkkr\" (UniqueName: \"kubernetes.io/projected/1f77190e-5983-4308-84c7-bfefbd17d581-kube-api-access-9vkkr\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104757 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-proc\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-podres\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104813 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-sys\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104877 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-podres\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.104939 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.104887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1f77190e-5983-4308-84c7-bfefbd17d581-lib-modules\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.112077 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.112058 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkkr\" (UniqueName: \"kubernetes.io/projected/1f77190e-5983-4308-84c7-bfefbd17d581-kube-api-access-9vkkr\") pod \"perf-node-gather-daemonset-ms9bj\" (UID: \"1f77190e-5983-4308-84c7-bfefbd17d581\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.207799 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.207713 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.322730 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.322563 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj"] Apr 17 00:01:09.325340 ip-10-0-136-147 kubenswrapper[2569]: W0417 00:01:09.325313 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1f77190e_5983_4308_84c7_bfefbd17d581.slice/crio-f8a365d78729298617838dcba5bcd262d0d4791558784cc69d18469b9ebf4589 WatchSource:0}: Error finding container f8a365d78729298617838dcba5bcd262d0d4791558784cc69d18469b9ebf4589: Status 404 returned error can't find the container with id f8a365d78729298617838dcba5bcd262d0d4791558784cc69d18469b9ebf4589 Apr 17 00:01:09.796289 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.796249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" event={"ID":"1f77190e-5983-4308-84c7-bfefbd17d581","Type":"ContainerStarted","Data":"ecc3a5343446677c235611675a1522e21d97e4dcc9d0454e0aff9d764d791641"} Apr 17 00:01:09.796289 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.796287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" event={"ID":"1f77190e-5983-4308-84c7-bfefbd17d581","Type":"ContainerStarted","Data":"f8a365d78729298617838dcba5bcd262d0d4791558784cc69d18469b9ebf4589"} Apr 17 00:01:09.796538 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.796358 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:09.812746 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:09.812698 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" podStartSLOduration=1.812685176 podStartE2EDuration="1.812685176s" podCreationTimestamp="2026-04-17 00:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:01:09.811120671 +0000 UTC m=+2130.412914890" watchObservedRunningTime="2026-04-17 00:01:09.812685176 +0000 UTC m=+2130.414479381" Apr 17 00:01:10.111006 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:10.110979 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9jbxk_521d672a-517e-4c54-a3c8-a1af436fb79c/dns/0.log" Apr 17 00:01:10.131412 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:10.131388 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9jbxk_521d672a-517e-4c54-a3c8-a1af436fb79c/kube-rbac-proxy/0.log" Apr 17 00:01:10.217290 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:10.217264 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5bhhr_1b15b520-d13a-41b6-b06f-81365371c0a0/dns-node-resolver/0.log" Apr 17 00:01:10.716897 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:10.716859 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cw7zr_621008c7-04c4-43b9-87b0-8ba9013bdecc/node-ca/0.log" Apr 17 00:01:11.651254 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:11.651219 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-nrgs4_9ac6d6e8-434f-43b8-9161-ebc9bbd75e46/discovery/0.log" Apr 17 00:01:11.670138 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:11.670112 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-577868f455-6t5sg_995cc232-3ec9-4d93-9d5c-b77cbca875ec/kube-auth-proxy/0.log" Apr 17 00:01:12.238186 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:12.238151 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2gmq5_3711a978-4a77-4055-9046-6ebb4a5ffeb1/serve-healthcheck-canary/0.log" Apr 17 00:01:12.753796 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:12.753765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw66v_6a0fd6cc-6997-498d-84cd-310d912c1372/kube-rbac-proxy/0.log" Apr 17 00:01:12.775641 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:12.775618 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw66v_6a0fd6cc-6997-498d-84cd-310d912c1372/exporter/0.log" Apr 17 00:01:12.797335 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:12.797312 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cw66v_6a0fd6cc-6997-498d-84cd-310d912c1372/extractor/0.log" Apr 17 00:01:14.793296 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.793266 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rc7c4_72a63abb-7588-45a5-8a4a-bb33634f216a/manager/0.log" Apr 17 00:01:14.843075 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.843045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29606400-hdwlc_a18c8650-8e45-4963-8fa9-aa8a5988d2c0/cleanup/1.log" Apr 17 00:01:14.843251 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.843156 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-key-cleanup-29606400-hdwlc_a18c8650-8e45-4963-8fa9-aa8a5988d2c0/cleanup/2.log" Apr 17 00:01:14.912213 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.912178 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-phsfh_7e21b3a7-9358-46a5-bb91-9f52f04115e3/manager/1.log" Apr 17 00:01:14.922651 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.922621 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-phsfh_7e21b3a7-9358-46a5-bb91-9f52f04115e3/manager/2.log" Apr 17 00:01:14.980544 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:14.980516 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-sj6lz_bbc0caec-b3c4-437a-a8cd-00e91951b67f/manager/0.log" Apr 17 00:01:15.809353 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:15.809324 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-ms9bj" Apr 17 00:01:16.133098 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:16.133055 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-65rd7_0476080a-4f5d-4f99-8ca4-ef93b1169b46/openshift-lws-operator/0.log" Apr 17 00:01:18.804777 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.804753 2569 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:01:18.827364 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.827333 2569 generic.go:358] "Generic (PLEG): container finished" podID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" containerID="5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee" exitCode=6 Apr 17 00:01:18.827507 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.827374 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerDied","Data":"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee"} Apr 17 00:01:18.827507 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.827407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" event={"ID":"a18c8650-8e45-4963-8fa9-aa8a5988d2c0","Type":"ContainerDied","Data":"bccf3ed28be8547655d186edd982b9c90f9c5b3f2df5f7d37b56d6ad6bb0d663"} Apr 17 00:01:18.827507 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.827421 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606400-hdwlc" Apr 17 00:01:18.827507 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.827432 2569 scope.go:117] "RemoveContainer" containerID="5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee" Apr 17 00:01:18.837656 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.837634 2569 scope.go:117] "RemoveContainer" containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" Apr 17 00:01:18.845390 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.845374 2569 scope.go:117] "RemoveContainer" containerID="5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee" Apr 17 00:01:18.845729 ip-10-0-136-147 kubenswrapper[2569]: E0417 00:01:18.845704 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee\": container with ID starting with 5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee not found: ID does not exist" containerID="5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee" Apr 17 00:01:18.845806 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.845741 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee"} err="failed to get container status \"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee\": rpc error: code = NotFound desc = could not find container \"5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee\": container with ID starting with 5ccaaabdd2adae050c8bedcdb477f5bb286948ddeecf2122324250171c20feee not found: ID does not exist" Apr 17 00:01:18.845806 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.845769 2569 scope.go:117] "RemoveContainer" containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" Apr 17 00:01:18.846119 ip-10-0-136-147 kubenswrapper[2569]: E0417 00:01:18.846097 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b\": container with ID starting with 2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b not found: ID does not exist" 
containerID="2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b" Apr 17 00:01:18.846192 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.846126 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b"} err="failed to get container status \"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b\": rpc error: code = NotFound desc = could not find container \"2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b\": container with ID starting with 2042120496fccf6d0135e565029a4ce28eb91dafde198031f4a5cb7824c5db7b not found: ID does not exist" Apr 17 00:01:18.883550 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.883527 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-582sh\" (UniqueName: \"kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh\") pod \"a18c8650-8e45-4963-8fa9-aa8a5988d2c0\" (UID: \"a18c8650-8e45-4963-8fa9-aa8a5988d2c0\") " Apr 17 00:01:18.885387 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.885365 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh" (OuterVolumeSpecName: "kube-api-access-582sh") pod "a18c8650-8e45-4963-8fa9-aa8a5988d2c0" (UID: "a18c8650-8e45-4963-8fa9-aa8a5988d2c0"). InnerVolumeSpecName "kube-api-access-582sh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:01:18.984452 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:18.984372 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-582sh\" (UniqueName: \"kubernetes.io/projected/a18c8650-8e45-4963-8fa9-aa8a5988d2c0-kube-api-access-582sh\") on node \"ip-10-0-136-147.ec2.internal\" DevicePath \"\"" Apr 17 00:01:19.150985 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:19.148706 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:01:19.152716 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:19.152685 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606400-hdwlc"] Apr 17 00:01:19.981335 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:19.981257 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18c8650-8e45-4963-8fa9-aa8a5988d2c0" path="/var/lib/kubelet/pods/a18c8650-8e45-4963-8fa9-aa8a5988d2c0/volumes" Apr 17 00:01:21.822682 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.822644 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/kube-multus-additional-cni-plugins/0.log" Apr 17 00:01:21.843972 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.843934 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/egress-router-binary-copy/0.log" Apr 17 00:01:21.866613 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.866593 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/cni-plugins/0.log" Apr 17 00:01:21.887698 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.887676 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/bond-cni-plugin/0.log" Apr 17 00:01:21.926596 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.926576 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/routeoverride-cni/0.log" Apr 17 00:01:21.982926 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:21.982901 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/whereabouts-cni-bincopy/0.log" Apr 17 00:01:22.046641 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:22.046616 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8t8tf_be198f06-cdcd-4d45-84cb-08bf655ad486/whereabouts-cni/0.log" Apr 17 00:01:22.397015 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:22.396985 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vhf6l_57932062-ba33-4931-9e05-3612d8392b49/kube-multus/0.log" Apr 17 00:01:22.444039 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:22.444010 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2mrw9_28beae99-07bd-4677-b7d1-d83bd564ca27/network-metrics-daemon/0.log" Apr 17 00:01:22.462512 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:22.462480 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2mrw9_28beae99-07bd-4677-b7d1-d83bd564ca27/kube-rbac-proxy/0.log" Apr 17 00:01:23.593594 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.593567 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/ovn-controller/0.log" Apr 17 00:01:23.623273 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.623248 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/ovn-acl-logging/0.log" Apr 17 00:01:23.642038 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.642015 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/kube-rbac-proxy-node/0.log" Apr 17 00:01:23.662506 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.662477 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 00:01:23.679137 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.679119 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/northd/0.log" Apr 17 00:01:23.698659 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.698642 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/nbdb/0.log" Apr 17 00:01:23.720593 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.720575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/sbdb/0.log" Apr 17 00:01:23.811773 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:23.811742 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jvvbp_e35128f8-82c9-4513-a410-0656f5f37ece/ovnkube-controller/0.log" Apr 17 00:01:25.162414 ip-10-0-136-147 kubenswrapper[2569]: I0417 00:01:25.162383 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hc8fq_ca2349c9-d7e3-465e-a9da-79633f8e3aaa/network-check-target-container/0.log"