Apr 17 17:22:26.432592 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 17:22:26.432604 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 17:22:26.432610 ip-10-0-135-127 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 17:22:26.432823 ip-10-0-135-127 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 17:22:36.489148 ip-10-0-135-127 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 17:22:36.489167 ip-10-0-135-127 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 87bf233e92f742fb8cd922a09bfcf6c9 --
Apr 17 17:25:00.136443 ip-10-0-135-127 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:25:00.629305 ip-10-0-135-127 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:00.629305 ip-10-0-135-127 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:25:00.629305 ip-10-0-135-127 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:00.629305 ip-10-0-135-127 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:25:00.629305 ip-10-0-135-127 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:25:00.632383 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.632297 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:25:00.635223 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635209 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:00.635223 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635222 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635227 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635236 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635240 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635243 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635246 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635250 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635254 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635257 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635260 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635263 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635266 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635269 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635272 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635275 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635277 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635280 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635283 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635286 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635289 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:00.635289 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635292 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635296 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635299 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635302 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635305 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635308 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635310 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635313 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635315 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635319 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635321 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635324 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635326 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635328 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635331 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635334 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635337 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635339 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635341 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635344 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:00.635751 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635346 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635348 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635351 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635353 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635356 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635358 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635361 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635363 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635365 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635368 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635370 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635372 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635375 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635378 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635381 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635384 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635387 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635390 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635393 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635395 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:00.636522 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635398 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635400 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635402 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635405 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635407 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635410 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635412 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635414 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635417 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635420 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635422 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635426 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635429 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635431 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635434 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635436 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635439 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635441 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635443 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:00.637313 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635446 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635448 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635450 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635453 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635455 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.635458 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636223 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636238 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636244 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636249 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636253 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636257 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636267 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636272 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636277 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636281 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636285 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636289 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636294 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:00.637803 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636298 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636303 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636308 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636313 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636317 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636323 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636332 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636336 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636340 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636345 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636349 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636353 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636357 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636362 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636366 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636371 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636375 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636378 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636389 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636393 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:00.638265 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636397 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636402 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636406 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636410 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636414 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636418 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636423 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636427 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636431 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636435 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636439 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636447 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636451 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636455 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636459 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636464 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636468 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636473 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636477 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636482 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:00.638734 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636486 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636490 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636496 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636506 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636510 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636515 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636519 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636523 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636527 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636531 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636536 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636541 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636545 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636550 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636561 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636567 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636576 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636580 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636587 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636591 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:00.639236 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636595 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636599 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636603 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636608 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636612 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636616 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636620 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636629 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636634 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636638 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636643 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636646 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.636651 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636842 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636857 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636864 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636869 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636874 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636880 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636884 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636889 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:25:00.639703 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636892 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636896 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636902 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636924 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636947 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636952 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636956 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636961 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636966 2574 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636969 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636973 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636980 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636987 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636991 2574 flags.go:64] FLAG: --config-dir=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636994 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.636998 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637003 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637007 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637010 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637013 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637019 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637022 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637026 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637030 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637033 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:25:00.640205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637038 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637041 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637045 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637048 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637054 2574 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637057 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637062 2574 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637065 2574 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637068 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637551 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637562 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637574 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637578 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637581 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637585 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637588 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637591 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637594 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637597 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637600 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637603 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637606 2574 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637610 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637613 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637616 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:25:00.640784 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637620 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637623 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637626 2574 flags.go:64] FLAG: --help="false"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637629 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-135-127.ec2.internal"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637633 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637636 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637639 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637643 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637647 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637649 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637652 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637655 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637658 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637662 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637665 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637668 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637671 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637674 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637678 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637681 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637684 2574 flags.go:64] FLAG: --lock-file=""
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637686 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637689 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637692 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 17:25:00.641385 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637698 2574 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637701 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637704 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637706 2574 flags.go:64] FLAG: --logging-format="text"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637709 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637713 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637716 2574 flags.go:64] FLAG: --manifest-url=""
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637719 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637723 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637726 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637730 2574 flags.go:64] FLAG: --max-pods="110"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637734 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637737 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637740 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637743 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637745 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637748 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637751 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637760 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637763 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637766 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637770 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637773 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 17:25:00.641961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637777 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637780 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637783 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637787 2574 flags.go:64] FLAG: --port="10250"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637790 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637793 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07233c64bc18bc8b3"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637797 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637799 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637802 2574 flags.go:64] FLAG: --register-node="true"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637806 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637809 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637813 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637816 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637820 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637823 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637826 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637830 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637832 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637835 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637839 2574 flags.go:64] FLAG: --runonce="false"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637842 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637845 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637848 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637851 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637854 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637857 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 17:25:00.642553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637860 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637863 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637866 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637869 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637872 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637875 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637878 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637881 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637884 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637890 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637897 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637900 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637904 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637907 2574 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637910 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637913 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637916 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637919 2574 flags.go:64] FLAG: --v="2"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637924 2574 flags.go:64] FLAG: --version="false"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637928 2574 flags.go:64] FLAG: --vmodule=""
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637932 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.637935 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638030 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638033 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:00.643152 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638037 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638040 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638044 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638048 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638051 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638055 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638057 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638060 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638063 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638066 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638069 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638071 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638074 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638077 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638079 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638082 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638085 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638090 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638094 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:00.643732 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638096 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638099 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638102 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638104 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638107 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638109 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638112 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638114 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638117 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638120 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638122 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638125 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638127 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638130 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638133 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638135 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638139 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638141 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638144 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638147 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:00.644231 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638149 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638152 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638154 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638157 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638159 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638161 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638164 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638180 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638183 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638186 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638190 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638193 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638196 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638199 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638201 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638204 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638207 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638209 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638212 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:00.644724 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638214 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638217 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638219 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638222 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638224 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638227 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638229 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638232 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638234 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638237 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638240 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638243 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638245 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638248 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638250 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638253 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638255 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638258 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638260 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638263 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:00.645196 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638265 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638268 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638271 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638274 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638280 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
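The journal repeats each `unrecognized feature gate` warning every time the gate set is re-parsed, which makes the raw stream hard to eyeball. A small deduplication sketch (my own addition, not part of any kubelet tooling; it assumes the journal lines are available as strings, e.g. piped from `journalctl -u kubelet`):

```python
import re
from collections import Counter

# Matches the gate name in "feature_gate.go:328] unrecognized feature gate: <Name>"
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized(lines):
    """Count how often each unknown feature gate is warned about."""
    return Counter(m.group(1) for m in (GATE_RE.search(l) for l in lines) if m)

sample = [
    'Apr 17 17:25:00.643152 host kubenswrapper[2574]: W0417 17:25:00.638030 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability',
    'Apr 17 17:25:00.643152 host kubenswrapper[2574]: W0417 17:25:00.638033 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS',
    'Apr 17 17:25:00.645716 host kubenswrapper[2574]: W0417 17:25:00.645385 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability',
]
print(count_unrecognized(sample))  # OVNObservability counted twice
```

These warnings are generally benign: they come from cluster-level gate names that the kubelet's own gate registry does not know about.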
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.638283 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.638969 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.645327 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.645341 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645385 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645390 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645393 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645396 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645400 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
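The `feature_gate.go:384` entry above dumps the effective gate set as a Go map literal (`{map[Name:bool ...]}`). A sketch of turning that dump into a Python dict for inspection (my own helper, not part of the log tooling; the sample entry below is abbreviated from the log):

```python
import re

def parse_feature_gates(entry):
    """Parse a "feature gates: {map[Name:bool ...]}" journal entry into a dict."""
    m = re.search(r"feature gates: \{map\[(.*)\]\}", entry)
    if m is None:
        return {}
    # Each space-separated pair looks like "ImageVolume:true"
    return {name: value == "true"
            for name, _, value in (pair.partition(":") for pair in m.group(1).split())}

entry = ('I0417 17:25:00.638969 2574 feature_gate.go:384] feature gates: '
         '{map[ImageVolume:true KMSv1:true NodeSwap:false ServiceAccountTokenNodeBinding:true]}')
gates = parse_feature_gates(entry)
print(gates["KMSv1"], gates["NodeSwap"])  # True False
```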
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645404 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645407 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:00.645716 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645410 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645412 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645415 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645417 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645420 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645422 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645425 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645427 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645430 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645432 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645435 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645438 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645440 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645443 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645445 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645448 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645451 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645454 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645458 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:00.646103 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645461 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645464 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645467 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645470 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645472 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645477 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645480 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645483 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645485 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645488 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645490 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645492 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645495 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645497 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645500 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645503 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645505 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645508 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645510 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645513 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:00.646565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645515 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645518 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645520 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645523 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645525 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645527 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645530 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645532 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645535 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645537 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645540 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645542 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645545 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645547 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645550 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645552 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645554 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645557 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645561 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:00.647036 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645563 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645565 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645568 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645571 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645573 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645576 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645578 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645580 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645583 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645585 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645588 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645590 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645593 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645595 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645597 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645600 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645602 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645604 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645607 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645609 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:00.647492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645612 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.645616 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645706 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645711 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645713 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645717 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645720 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645722 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645725 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645728 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645732 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645735 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645738 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645741 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645743 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:25:00.647957 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645747 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645751 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645754 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645756 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645759 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645761 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645764 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645767 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645770 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645772 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645775 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645777 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645780 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645782 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645785 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645787 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645789 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645792 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645794 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645797 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:25:00.648405 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645799 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645801 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645804 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645807 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645809 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645811 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645814 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645817 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645820 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645822 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645825 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645828 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645830 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645833 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645835 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645838 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645840 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645842 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645845 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:25:00.648925 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645847 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645850 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645852 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645854 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645857 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645860 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645864 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645867 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645869 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645872 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645874 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645877 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645879 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645882 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645884 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645886 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645889 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645891 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645894 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:25:00.649386 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645897 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645900 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645903 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645905 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645908 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645910 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645913 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645915 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645918 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645920 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645923 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645925 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645927 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645930 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:00.645932 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.645937 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:25:00.649824 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.646900 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:25:00.651061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.651047 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:25:00.652543 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.652529 2574 server.go:1019] "Starting client certificate rotation"
Apr 17 17:25:00.652656 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.652639 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:00.652692 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.652680 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:25:00.680985 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.680966 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:00.683497 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.683479 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:25:00.700659 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.700637 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:25:00.709143 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.709120 2574 log.go:25] "Validated CRI v1 image API"
Apr 17 17:25:00.710534 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.710514 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:25:00.713422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.713399 2574 fs.go:135] Filesystem UUIDs: map[0c6d61e4-0299-4e80-8a6d-da0ffb13e57a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9deef47d-7cb3-4952-b7a2-ea38f663ada7:/dev/nvme0n1p3]
Apr 17 17:25:00.713494 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.713419 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:25:00.716479 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.716461 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:25:00.719430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.719328 2574 manager.go:217] Machine: {Timestamp:2026-04-17 17:25:00.717090528 +0000 UTC m=+0.453304352 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200138 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec204328a3706270c57cf34223a8ee42 SystemUUID:ec204328-a370-6270-c57c-f34223a8ee42 BootID:87bf233e-92f7-42fb-8cd9-22a09bfcf6c9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:aa:b8:34:45:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:aa:b8:34:45:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:52:27:4a:28:ec:b5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:25:00.719430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.719425 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:25:00.719567 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.719554 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:25:00.721997 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.721976 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:25:00.722137 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.721999 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-127.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:25:00.722199 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.722146 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:25:00.722199 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.722155 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:25:00.722199 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.722180
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:00.723103 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.723092 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:25:00.725194 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.725183 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:00.725294 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.725285 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:25:00.727922 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.727912 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:25:00.727956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.727928 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:25:00.727956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.727941 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:25:00.727956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.727951 2574 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:25:00.728071 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.727959 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 17:25:00.729228 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.729216 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:00.729267 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.729234 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:25:00.732612 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.732596 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:25:00.733902 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:00.733888 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:25:00.735242 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735231 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:25:00.735285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735256 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:25:00.735285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735265 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:25:00.735285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735273 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:25:00.735285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735281 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735290 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735298 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735308 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735328 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735337 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735357 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
17:25:00.735392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.735370 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:25:00.736287 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.736275 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:25:00.736319 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.736290 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:25:00.739994 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.739834 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:25:00.740053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.740018 2574 server.go:1295] "Started kubelet" Apr 17 17:25:00.740103 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.739931 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:25:00.740103 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.739948 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-127.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:25:00.740103 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.739840 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-127.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:25:00.740245 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.740114 2574 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 17 17:25:00.740245 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.740187 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:25:00.740325 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.740199 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:25:00.740877 ip-10-0-135-127 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:25:00.741507 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.741493 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:25:00.742981 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.742965 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:25:00.747228 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.746300 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-127.ec2.internal.18a734dcc005ae60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-127.ec2.internal,UID:ip-10-0-135-127.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-127.ec2.internal,},FirstTimestamp:2026-04-17 17:25:00.73999728 +0000 UTC m=+0.476211105,LastTimestamp:2026-04-17 17:25:00.73999728 +0000 UTC m=+0.476211105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-127.ec2.internal,}" Apr 17 17:25:00.747961 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.747935 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:25:00.748496 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:00.748479 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:25:00.749696 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.749678 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:25:00.749696 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.749698 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:25:00.750004 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.749982 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:25:00.750110 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.750095 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:25:00.750110 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.750109 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:25:00.750538 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.750514 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:00.750538 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.750532 2574 factory.go:55] Registering systemd factory Apr 17 17:25:00.750663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.750575 2574 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:25:00.751033 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751016 2574 factory.go:153] Registering CRI-O factory Apr 17 17:25:00.751033 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751033 2574 factory.go:223] Registration of the crio container factory successfully Apr 17 17:25:00.751157 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.751071 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:25:00.751157 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751106 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:25:00.751157 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751130 2574 factory.go:103] Registering Raw factory Apr 17 17:25:00.751157 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751151 2574 manager.go:1196] Started watching for new ooms in manager Apr 17 17:25:00.751521 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.751510 2574 manager.go:319] Starting recovery of all containers Apr 17 17:25:00.754337 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.754304 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-127.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:25:00.754531 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.754508 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:25:00.761322 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.761305 2574 manager.go:324] Recovery completed Apr 17 17:25:00.763010 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.762991 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch 
/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 17:25:00.765775 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.765762 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.767990 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.767976 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.768061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.768002 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.768061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.768012 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.768529 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.768514 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:25:00.768607 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.768530 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:25:00.768607 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.768546 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:25:00.769859 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.769801 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-127.ec2.internal.18a734dcc1b0d231 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-127.ec2.internal,UID:ip-10-0-135-127.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-127.ec2.internal 
status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-127.ec2.internal,},FirstTimestamp:2026-04-17 17:25:00.767990321 +0000 UTC m=+0.504204146,LastTimestamp:2026-04-17 17:25:00.767990321 +0000 UTC m=+0.504204146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-127.ec2.internal,}" Apr 17 17:25:00.771243 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.771228 2574 policy_none.go:49] "None policy: Start" Apr 17 17:25:00.771304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.771249 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:25:00.771304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.771264 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:25:00.778077 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.778060 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p2gzf" Apr 17 17:25:00.779862 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.779768 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-127.ec2.internal.18a734dcc1b1121c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-127.ec2.internal,UID:ip-10-0-135-127.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-127.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-127.ec2.internal,},FirstTimestamp:2026-04-17 17:25:00.768006684 +0000 UTC m=+0.504220509,LastTimestamp:2026-04-17 17:25:00.768006684 +0000 UTC 
m=+0.504220509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-127.ec2.internal,}" Apr 17 17:25:00.788441 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.788418 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-p2gzf" Apr 17 17:25:00.803882 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.790027 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-127.ec2.internal.18a734dcc1b13746 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-127.ec2.internal,UID:ip-10-0-135-127.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-127.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-127.ec2.internal,},FirstTimestamp:2026-04-17 17:25:00.768016198 +0000 UTC m=+0.504230023,LastTimestamp:2026-04-17 17:25:00.768016198 +0000 UTC m=+0.504230023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-127.ec2.internal,}" Apr 17 17:25:00.808729 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.808712 2574 manager.go:341] "Starting Device Plugin manager" Apr 17 17:25:00.808819 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.808743 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:25:00.808819 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.808753 2574 server.go:85] "Starting device plugin registration server" Apr 17 17:25:00.809025 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:00.809013 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:25:00.809061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.809029 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:25:00.809151 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.809127 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:25:00.809262 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.809232 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:25:00.809262 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.809253 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:25:00.809726 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.809705 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:25:00.809801 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.809747 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:00.851554 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.851526 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:25:00.852616 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.852598 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:25:00.852678 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.852631 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:25:00.852678 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.852659 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:25:00.852678 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.852670 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:25:00.852783 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.852711 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:25:00.855779 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.855761 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:00.909932 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.909833 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.910831 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.910815 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.910911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.910845 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.910911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.910854 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.910911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.910876 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-127.ec2.internal" Apr 17 17:25:00.919662 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.919642 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-127.ec2.internal" Apr 17 17:25:00.919753 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.919669 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-127.ec2.internal\": node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 
17:25:00.932803 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.932783 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:00.953823 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.953798 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal"] Apr 17 17:25:00.953916 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.953873 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.954667 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.954652 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.954750 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.954684 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.954750 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.954698 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.956014 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956000 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.956140 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956126 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:00.956230 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956155 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.956690 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956676 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.956761 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956688 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.956761 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956703 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.956761 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956711 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.956761 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956718 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.956992 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.956721 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.957911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.957895 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:00.957987 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.957919 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:25:00.958555 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.958539 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:25:00.958632 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.958574 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:25:00.958632 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:00.958589 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:25:00.983845 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.983825 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-127.ec2.internal\" not found" node="ip-10-0-135-127.ec2.internal" Apr 17 17:25:00.988019 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:00.988002 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-127.ec2.internal\" not found" node="ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.033610 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.033588 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.134002 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.133962 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.151323 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.151301 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod \"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.151410 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.151335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.151410 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.151354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.234719 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.234647 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.252040 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 
17:25:01.252122 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.252122 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod \"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.252122 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.252249 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d5d09bbd1af6f808e94311449d7cd444-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal\" (UID: \"d5d09bbd1af6f808e94311449d7cd444\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.252249 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.252118 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/356446819b043d77b4ba2d5504f23404-config\") pod \"kube-apiserver-proxy-ip-10-0-135-127.ec2.internal\" (UID: \"356446819b043d77b4ba2d5504f23404\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.287174 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.287131 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.290021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.290001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.335649 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.335624 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.436240 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.436196 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.536676 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.536643 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.637252 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.637207 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.651504 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.651478 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:25:01.651626 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.651611 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very 
short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:25:01.729077 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.729045 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:01.732367 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.732346 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:01.737464 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.737447 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.748909 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.748889 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:25:01.759884 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.759865 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:25:01.782745 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.782725 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m84lp" Apr 17 17:25:01.790817 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.790761 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:20:00 +0000 UTC" deadline="2027-09-21 21:20:00.666742042 +0000 UTC" Apr 17 17:25:01.790817 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.790786 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12531h54m58.875958177s" Apr 17 17:25:01.790817 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:01.790794 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m84lp" Apr 17 17:25:01.838414 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:01.838381 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-127.ec2.internal\" not found" Apr 17 17:25:01.887514 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.887491 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:01.950103 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.950083 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.961100 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.961075 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:25:01.961986 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.961973 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" Apr 17 17:25:01.972465 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.972444 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:25:01.989051 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:01.989025 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d09bbd1af6f808e94311449d7cd444.slice/crio-a55c2d2de407e3c25b0a84cb64105ed98db4f2e8fddfcb02cfe0bd3b90857154 WatchSource:0}: Error finding container a55c2d2de407e3c25b0a84cb64105ed98db4f2e8fddfcb02cfe0bd3b90857154: Status 404 returned error can't find the 
container with id a55c2d2de407e3c25b0a84cb64105ed98db4f2e8fddfcb02cfe0bd3b90857154 Apr 17 17:25:01.989663 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:01.989649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod356446819b043d77b4ba2d5504f23404.slice/crio-bc0b2f1574263f3582ec89ed9dc6249f871387afa9890f7ec6196cccc0c02dba WatchSource:0}: Error finding container bc0b2f1574263f3582ec89ed9dc6249f871387afa9890f7ec6196cccc0c02dba: Status 404 returned error can't find the container with id bc0b2f1574263f3582ec89ed9dc6249f871387afa9890f7ec6196cccc0c02dba Apr 17 17:25:01.995196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:01.995161 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:25:02.729838 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.729806 2574 apiserver.go:52] "Watching apiserver" Apr 17 17:25:02.742125 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.742087 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:25:02.743507 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.743073 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nrxzd","openshift-dns/node-resolver-4dfq5","openshift-image-registry/node-ca-d75sg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal","openshift-multus/multus-5b72d","openshift-multus/network-metrics-daemon-cqtr2","openshift-network-diagnostics/network-check-target-vvm4h","openshift-ovn-kubernetes/ovnkube-node-sgzx2","kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz","openshift-cluster-node-tuning-operator/tuned-5wf77","openshift-multus/multus-additional-cni-plugins-cf28x","openshift-network-operator/iptables-alerter-wt5ct"] Apr 17 17:25:02.745155 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:02.744762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:02.745155 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.744847 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:02.746356 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.745926 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.748956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.748471 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.750196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.749538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8fntx\"" Apr 17 17:25:02.750196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.749714 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:25:02.750196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.749913 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.750196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.750093 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.752304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.751302 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:25:02.752304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.751491 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cjj9q\"" Apr 17 17:25:02.752304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.751631 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.752304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.751794 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.752304 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.752154 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:25:02.753151 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.753130 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.753255 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.753213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:02.755056 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.754629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.756323 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.756074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.756323 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.756205 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.757606 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.757380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.757895 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.758244 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lcdx\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.758599 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.758786 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.758809 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759034 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nxwz8\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759348 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759803 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2a664808-7e4f-495a-bd8e-3278a11bb604-agent-certs\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-os-release\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.759999 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-kubelet\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blq4k\" (UniqueName: \"kubernetes.io/projected/f3033f4c-b4a1-45de-8f08-0fbf65425c86-kube-api-access-blq4k\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760071 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-cni-binary-copy\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-netns\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-hostroot\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-etc-kubernetes\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760258 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:02.761449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01868bf7-f6d2-461d-8bf1-006126117f62-host-slash\") pod 
\"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-system-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760328 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-k8s-cni-cncf-io\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-bin\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-multus\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760412 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-conf-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01868bf7-f6d2-461d-8bf1-006126117f62-iptables-alerter-script\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760468 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvslt\" (UniqueName: \"kubernetes.io/projected/01868bf7-f6d2-461d-8bf1-006126117f62-kube-api-access-zvslt\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2a664808-7e4f-495a-bd8e-3278a11bb604-konnectivity-ca\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-cnibin\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-socket-dir-parent\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-daemon-config\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760636 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-65brh\"" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760703 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760800 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.760637 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-multus-certs\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.761008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7lz\" (UniqueName: \"kubernetes.io/projected/807ce854-eb81-42f4-8fb8-0060d033ffbf-kube-api-access-hf7lz\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.762645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.761281 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.763426 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.763294 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.763426 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.763346 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.764427 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.764135 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vnn4x\"" Apr 17 17:25:02.764427 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.764187 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.764572 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.764429 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:25:02.764915 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.764895 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mvbh9\"" Apr 17 17:25:02.765180 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.764895 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:25:02.765813 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.765657 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.766212 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.766004 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.766347 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.766331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nkq72\"" Apr 17 17:25:02.767484 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.766932 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:25:02.767484 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.767122 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:25:02.767484 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.767273 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:25:02.767484 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.767329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s5krx\"" Apr 17 17:25:02.792941 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.792905 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:01 +0000 UTC" deadline="2027-10-12 07:37:45.522795528 +0000 UTC" Apr 17 17:25:02.792941 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.792941 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13022h12m42.72985795s" Apr 17 17:25:02.851081 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.851060 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:25:02.857301 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.857221 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" event={"ID":"356446819b043d77b4ba2d5504f23404","Type":"ContainerStarted","Data":"bc0b2f1574263f3582ec89ed9dc6249f871387afa9890f7ec6196cccc0c02dba"} Apr 17 17:25:02.858266 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.858242 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerStarted","Data":"a55c2d2de407e3c25b0a84cb64105ed98db4f2e8fddfcb02cfe0bd3b90857154"} Apr 17 17:25:02.861556 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-sys\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.861668 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-cnibin\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.861668 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-modprobe-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.861668 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-env-overrides\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.861668 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861647 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4m72\" (UniqueName: \"kubernetes.io/projected/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-kube-api-access-p4m72\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2a664808-7e4f-495a-bd8e-3278a11bb604-agent-certs\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02498340-44b9-4152-9802-82fbeecce918-host\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-tuned\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.861926 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:02.861765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-netd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-system-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-bin\") pod \"multus-5b72d\" (UID: 
\"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-conf-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.861926 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-socket-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwcl\" (UniqueName: \"kubernetes.io/projected/474e9a38-21a3-415a-a945-80417640d569-kube-api-access-vcwcl\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861973 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-ovn\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.861996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvslt\" (UniqueName: \"kubernetes.io/projected/01868bf7-f6d2-461d-8bf1-006126117f62-kube-api-access-zvslt\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-os-release\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-binary-copy\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-netns\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.862443 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:02.862224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-bin\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862239 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-system-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-conf-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-node-log\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blq4k\" (UniqueName: \"kubernetes.io/projected/f3033f4c-b4a1-45de-8f08-0fbf65425c86-kube-api-access-blq4k\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: 
I0417 17:25:02.862324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-cnibin\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.862443 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862363 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862488 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862521 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-sys-fs\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovn-node-metrics-cert\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-856v5\" (UniqueName: \"kubernetes.io/projected/d31b25a9-8351-4624-8ef6-a1389bdd2474-kube-api-access-856v5\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-netns\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862676 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-62nhl\" (UniqueName: \"kubernetes.io/projected/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kube-api-access-62nhl\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862701 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-netns\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02498340-44b9-4152-9802-82fbeecce918-serviceca\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-etc-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d31b25a9-8351-4624-8ef6-a1389bdd2474-hosts-file\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862848 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-system-cni-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-multus-certs\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-registration-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-var-lib-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: 
\"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.863053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.862991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-multus-certs\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-lib-modules\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01868bf7-f6d2-461d-8bf1-006126117f62-iptables-alerter-script\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2a664808-7e4f-495a-bd8e-3278a11bb604-konnectivity-ca\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7lz\" (UniqueName: 
\"kubernetes.io/projected/807ce854-eb81-42f4-8fb8-0060d033ffbf-kube-api-access-hf7lz\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-kubernetes\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863199 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-os-release\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-kubelet\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-conf\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-var-lib-kubelet\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-cni-binary-copy\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-hostroot\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-etc-kubernetes\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-kubelet\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-etc-kubernetes\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtcm\" (UniqueName: \"kubernetes.io/projected/bdfc917a-4e35-4bac-8c08-84c70e29539e-kube-api-access-lqtcm\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.863804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-slash\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-hostroot\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-config\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01868bf7-f6d2-461d-8bf1-006126117f62-host-slash\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-run\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-log-socket\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-bin\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863634 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d31b25a9-8351-4624-8ef6-a1389bdd2474-tmp-dir\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-socket-dir-parent\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-systemd\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/01868bf7-f6d2-461d-8bf1-006126117f62-iptables-alerter-script\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863717 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-host\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 
17:25:02.863754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/01868bf7-f6d2-461d-8bf1-006126117f62-host-slash\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863747 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysconfig\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-cni-dir\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-os-release\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.864584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863809 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-kubelet\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-socket-dir-parent\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863866 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-systemd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-script-lib\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 
17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-cni-binary-copy\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-tmp\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.863988 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-cnibin\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dbq4\" (UniqueName: \"kubernetes.io/projected/02498340-44b9-4152-9802-82fbeecce918-kube-api-access-7dbq4\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-systemd-units\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-k8s-cni-cncf-io\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-multus\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-daemon-config\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-device-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.865204 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.864209 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.864276 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:03.364256903 +0000 UTC m=+3.100470732 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-var-lib-cni-multus\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2a664808-7e4f-495a-bd8e-3278a11bb604-konnectivity-ca\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/807ce854-eb81-42f4-8fb8-0060d033ffbf-host-run-k8s-cni-cncf-io\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.864675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/807ce854-eb81-42f4-8fb8-0060d033ffbf-multus-daemon-config\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.865806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.865604 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2a664808-7e4f-495a-bd8e-3278a11bb604-agent-certs\") pod \"konnectivity-agent-nrxzd\" (UID: \"2a664808-7e4f-495a-bd8e-3278a11bb604\") " pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:02.872965 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.872727 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:02.872965 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.872752 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:02.872965 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.872765 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:02.872965 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:02.872865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:03.372846807 +0000 UTC m=+3.109060622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:02.875411 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.875393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7lz\" (UniqueName: \"kubernetes.io/projected/807ce854-eb81-42f4-8fb8-0060d033ffbf-kube-api-access-hf7lz\") pod \"multus-5b72d\" (UID: \"807ce854-eb81-42f4-8fb8-0060d033ffbf\") " pod="openshift-multus/multus-5b72d" Apr 17 17:25:02.875531 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.875513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvslt\" (UniqueName: \"kubernetes.io/projected/01868bf7-f6d2-461d-8bf1-006126117f62-kube-api-access-zvslt\") pod \"iptables-alerter-wt5ct\" (UID: \"01868bf7-f6d2-461d-8bf1-006126117f62\") " pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:02.875874 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.875857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blq4k\" (UniqueName: \"kubernetes.io/projected/f3033f4c-b4a1-45de-8f08-0fbf65425c86-kube-api-access-blq4k\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:02.965397 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtcm\" (UniqueName: \"kubernetes.io/projected/bdfc917a-4e35-4bac-8c08-84c70e29539e-kube-api-access-lqtcm\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.965397 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-slash\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-config\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-run\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-slash\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-run\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 
17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965581 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-log-socket\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-bin\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-log-socket\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d31b25a9-8351-4624-8ef6-a1389bdd2474-tmp-dir\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-bin\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.965865 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:02.965752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-systemd\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-host\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysconfig\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.965865 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-kubelet\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 
ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965872 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-systemd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-script-lib\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-tmp\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-cnibin\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d31b25a9-8351-4624-8ef6-a1389bdd2474-tmp-dir\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.966239 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:02.965965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dbq4\" (UniqueName: \"kubernetes.io/projected/02498340-44b9-4152-9802-82fbeecce918-kube-api-access-7dbq4\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.965987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-systemd-units\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-systemd\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-config\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-device-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-sys\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-modprobe-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysconfig\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-env-overrides\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-host\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4m72\" (UniqueName: \"kubernetes.io/projected/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-kube-api-access-p4m72\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.966239 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02498340-44b9-4152-9802-82fbeecce918-host\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-device-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-cnibin\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-kubelet\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-modprobe-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966343 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-systemd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: 
\"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-systemd-units\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-sys\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02498340-44b9-4152-9802-82fbeecce918-host\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-tuned\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-netd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-cni-netd\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-socket-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-env-overrides\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwcl\" (UniqueName: \"kubernetes.io/projected/474e9a38-21a3-415a-a945-80417640d569-kube-api-access-vcwcl\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-ovn\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966677 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-socket-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-os-release\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-run-ovn\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-binary-copy\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 
17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-netns\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-os-release\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-ovn-kubernetes\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-node-log\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-host-run-netns\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovnkube-script-lib\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.967762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-node-log\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-sys-fs\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: 
\"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovn-node-metrics-cert\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-etc-selinux\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-sys-fs\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-856v5\" (UniqueName: 
\"kubernetes.io/projected/d31b25a9-8351-4624-8ef6-a1389bdd2474-kube-api-access-856v5\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.966983 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62nhl\" (UniqueName: \"kubernetes.io/projected/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kube-api-access-62nhl\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-binary-copy\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02498340-44b9-4152-9802-82fbeecce918-serviceca\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-etc-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967254 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d31b25a9-8351-4624-8ef6-a1389bdd2474-hosts-file\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-system-cni-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-registration-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-var-lib-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 
17:25:02.968425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-lib-modules\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-kubernetes\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/474e9a38-21a3-415a-a945-80417640d569-system-cni-dir\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-d\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-etc-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d31b25a9-8351-4624-8ef6-a1389bdd2474-hosts-file\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02498340-44b9-4152-9802-82fbeecce918-serviceca\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-conf\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-var-lib-openvswitch\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-var-lib-kubelet\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1a26ef0-b655-4041-9be3-b2b3b545a29f-registration-dir\") pod \"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-sysctl-conf\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967824 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/474e9a38-21a3-415a-a945-80417640d569-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-var-lib-kubelet\") pod 
\"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-kubernetes\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.967879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bdfc917a-4e35-4bac-8c08-84c70e29539e-lib-modules\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.968779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-tmp\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.969899 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.969081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bdfc917a-4e35-4bac-8c08-84c70e29539e-etc-tuned\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.970092 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.969987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-ovn-node-metrics-cert\") pod \"ovnkube-node-sgzx2\" (UID: 
\"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.973905 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.973881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtcm\" (UniqueName: \"kubernetes.io/projected/bdfc917a-4e35-4bac-8c08-84c70e29539e-kube-api-access-lqtcm\") pod \"tuned-5wf77\" (UID: \"bdfc917a-4e35-4bac-8c08-84c70e29539e\") " pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:02.974315 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.974255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dbq4\" (UniqueName: \"kubernetes.io/projected/02498340-44b9-4152-9802-82fbeecce918-kube-api-access-7dbq4\") pod \"node-ca-d75sg\" (UID: \"02498340-44b9-4152-9802-82fbeecce918\") " pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:02.976101 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.976071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-856v5\" (UniqueName: \"kubernetes.io/projected/d31b25a9-8351-4624-8ef6-a1389bdd2474-kube-api-access-856v5\") pod \"node-resolver-4dfq5\" (UID: \"d31b25a9-8351-4624-8ef6-a1389bdd2474\") " pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:02.976570 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.976522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4m72\" (UniqueName: \"kubernetes.io/projected/110a4c18-b7af-4bb1-8f5e-f332eb485ccb-kube-api-access-p4m72\") pod \"ovnkube-node-sgzx2\" (UID: \"110a4c18-b7af-4bb1-8f5e-f332eb485ccb\") " pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:02.976676 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.976657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62nhl\" (UniqueName: \"kubernetes.io/projected/c1a26ef0-b655-4041-9be3-b2b3b545a29f-kube-api-access-62nhl\") pod 
\"aws-ebs-csi-driver-node-ld9sz\" (UID: \"c1a26ef0-b655-4041-9be3-b2b3b545a29f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:02.978475 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:02.978452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwcl\" (UniqueName: \"kubernetes.io/projected/474e9a38-21a3-415a-a945-80417640d569-kube-api-access-vcwcl\") pod \"multus-additional-cni-plugins-cf28x\" (UID: \"474e9a38-21a3-415a-a945-80417640d569\") " pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:03.066529 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.066496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wt5ct" Apr 17 17:25:03.075393 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.075371 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5b72d" Apr 17 17:25:03.086053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.086028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d75sg" Apr 17 17:25:03.093687 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.093662 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nrxzd" Apr 17 17:25:03.101297 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.101275 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:03.108854 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.108834 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5wf77" Apr 17 17:25:03.113728 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.113710 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:25:03.117863 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.117846 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cf28x" Apr 17 17:25:03.123403 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.123381 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4dfq5" Apr 17 17:25:03.130973 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.130954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" Apr 17 17:25:03.370205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.370111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:03.370367 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.370291 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:03.370367 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.370355 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:04.370337652 +0000 UTC m=+4.106551479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:03.471080 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.471051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:03.471234 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.471193 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:03.471234 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.471207 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:03.471234 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.471217 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:03.471335 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.471263 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:04.471248447 +0000 UTC m=+4.207462259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:03.656565 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.656531 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474e9a38_21a3_415a_a945_80417640d569.slice/crio-307fe17fc6c3500215822c5fd9bfdda91dc84ca33299d2e0e36c30d22bc70fd4 WatchSource:0}: Error finding container 307fe17fc6c3500215822c5fd9bfdda91dc84ca33299d2e0e36c30d22bc70fd4: Status 404 returned error can't find the container with id 307fe17fc6c3500215822c5fd9bfdda91dc84ca33299d2e0e36c30d22bc70fd4 Apr 17 17:25:03.668829 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.668803 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod807ce854_eb81_42f4_8fb8_0060d033ffbf.slice/crio-da98340c85b551488fc7784c1706b71f2a7d7cd80e531ac351e546c919394342 WatchSource:0}: Error finding container da98340c85b551488fc7784c1706b71f2a7d7cd80e531ac351e546c919394342: Status 404 returned error can't find the container with id da98340c85b551488fc7784c1706b71f2a7d7cd80e531ac351e546c919394342 Apr 17 17:25:03.673330 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.673310 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01868bf7_f6d2_461d_8bf1_006126117f62.slice/crio-02de76d74dd2cb4a4b4fc2e81b635b6f7dbbd1d4a2152d1164fc22672683658a WatchSource:0}: Error finding container 
02de76d74dd2cb4a4b4fc2e81b635b6f7dbbd1d4a2152d1164fc22672683658a: Status 404 returned error can't find the container with id 02de76d74dd2cb4a4b4fc2e81b635b6f7dbbd1d4a2152d1164fc22672683658a Apr 17 17:25:03.674715 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.674691 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1a26ef0_b655_4041_9be3_b2b3b545a29f.slice/crio-5dba1536d909194847491cacf546e1a16af9a19ea25225744cc451c5632e555a WatchSource:0}: Error finding container 5dba1536d909194847491cacf546e1a16af9a19ea25225744cc451c5632e555a: Status 404 returned error can't find the container with id 5dba1536d909194847491cacf546e1a16af9a19ea25225744cc451c5632e555a Apr 17 17:25:03.675547 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.675504 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd31b25a9_8351_4624_8ef6_a1389bdd2474.slice/crio-f6c3dccf4c122c4a9e105b594a284b5a11852d8570b06d50e060e4734ce09394 WatchSource:0}: Error finding container f6c3dccf4c122c4a9e105b594a284b5a11852d8570b06d50e060e4734ce09394: Status 404 returned error can't find the container with id f6c3dccf4c122c4a9e105b594a284b5a11852d8570b06d50e060e4734ce09394 Apr 17 17:25:03.676996 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.676976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02498340_44b9_4152_9802_82fbeecce918.slice/crio-37bd028903ecfb660404a90ea7db77b1a8ebe227e4ef8b1ff8d7b2dcf94cb257 WatchSource:0}: Error finding container 37bd028903ecfb660404a90ea7db77b1a8ebe227e4ef8b1ff8d7b2dcf94cb257: Status 404 returned error can't find the container with id 37bd028903ecfb660404a90ea7db77b1a8ebe227e4ef8b1ff8d7b2dcf94cb257 Apr 17 17:25:03.677744 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.677723 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110a4c18_b7af_4bb1_8f5e_f332eb485ccb.slice/crio-69dba59856ee88830e5d82c4a5d1f35f37a6f358a538f574609f39f817a8eba7 WatchSource:0}: Error finding container 69dba59856ee88830e5d82c4a5d1f35f37a6f358a538f574609f39f817a8eba7: Status 404 returned error can't find the container with id 69dba59856ee88830e5d82c4a5d1f35f37a6f358a538f574609f39f817a8eba7 Apr 17 17:25:03.678661 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.678641 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a664808_7e4f_495a_bd8e_3278a11bb604.slice/crio-466a21050a8ca604c44dc4791e26490f1e3c73354bd86312e1048a1484a69158 WatchSource:0}: Error finding container 466a21050a8ca604c44dc4791e26490f1e3c73354bd86312e1048a1484a69158: Status 404 returned error can't find the container with id 466a21050a8ca604c44dc4791e26490f1e3c73354bd86312e1048a1484a69158 Apr 17 17:25:03.680930 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:03.680661 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdfc917a_4e35_4bac_8c08_84c70e29539e.slice/crio-46d21c73ce3aa7c60649f10a5bb658878496e68897b2e7623dbb64cc7e62b302 WatchSource:0}: Error finding container 46d21c73ce3aa7c60649f10a5bb658878496e68897b2e7623dbb64cc7e62b302: Status 404 returned error can't find the container with id 46d21c73ce3aa7c60649f10a5bb658878496e68897b2e7623dbb64cc7e62b302 Apr 17 17:25:03.793700 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.793671 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:20:01 +0000 UTC" deadline="2027-11-08 20:38:03.349043647 +0000 UTC" Apr 17 17:25:03.793700 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.793698 2574 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13683h12m59.555348378s" Apr 17 17:25:03.853009 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.852982 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:03.853148 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:03.853079 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:03.861040 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.861014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wt5ct" event={"ID":"01868bf7-f6d2-461d-8bf1-006126117f62","Type":"ContainerStarted","Data":"02de76d74dd2cb4a4b4fc2e81b635b6f7dbbd1d4a2152d1164fc22672683658a"} Apr 17 17:25:03.862508 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.862482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" event={"ID":"356446819b043d77b4ba2d5504f23404","Type":"ContainerStarted","Data":"bd49e50547349de25799ebaf38b6b61260e55067bafd99473eba4c22dda74016"} Apr 17 17:25:03.863351 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.863323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dfq5" event={"ID":"d31b25a9-8351-4624-8ef6-a1389bdd2474","Type":"ContainerStarted","Data":"f6c3dccf4c122c4a9e105b594a284b5a11852d8570b06d50e060e4734ce09394"} Apr 17 17:25:03.864214 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.864193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" event={"ID":"c1a26ef0-b655-4041-9be3-b2b3b545a29f","Type":"ContainerStarted","Data":"5dba1536d909194847491cacf546e1a16af9a19ea25225744cc451c5632e555a"} Apr 17 17:25:03.865109 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.865087 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5b72d" event={"ID":"807ce854-eb81-42f4-8fb8-0060d033ffbf","Type":"ContainerStarted","Data":"da98340c85b551488fc7784c1706b71f2a7d7cd80e531ac351e546c919394342"} Apr 17 17:25:03.866109 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.866079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerStarted","Data":"307fe17fc6c3500215822c5fd9bfdda91dc84ca33299d2e0e36c30d22bc70fd4"} Apr 17 17:25:03.867012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.866992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5wf77" event={"ID":"bdfc917a-4e35-4bac-8c08-84c70e29539e","Type":"ContainerStarted","Data":"46d21c73ce3aa7c60649f10a5bb658878496e68897b2e7623dbb64cc7e62b302"} Apr 17 17:25:03.867915 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.867888 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nrxzd" event={"ID":"2a664808-7e4f-495a-bd8e-3278a11bb604","Type":"ContainerStarted","Data":"466a21050a8ca604c44dc4791e26490f1e3c73354bd86312e1048a1484a69158"} Apr 17 17:25:03.868845 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.868826 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"69dba59856ee88830e5d82c4a5d1f35f37a6f358a538f574609f39f817a8eba7"} Apr 17 17:25:03.869651 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.869634 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d75sg" event={"ID":"02498340-44b9-4152-9802-82fbeecce918","Type":"ContainerStarted","Data":"37bd028903ecfb660404a90ea7db77b1a8ebe227e4ef8b1ff8d7b2dcf94cb257"} Apr 17 17:25:03.876402 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:03.876366 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-127.ec2.internal" podStartSLOduration=2.876354662 podStartE2EDuration="2.876354662s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:03.87582187 +0000 UTC m=+3.612035714" watchObservedRunningTime="2026-04-17 17:25:03.876354662 +0000 UTC m=+3.612568495" Apr 17 17:25:04.381014 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:04.380976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:04.381185 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.381154 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:04.381255 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.381247 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:06.381229349 +0000 UTC m=+6.117443166 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:04.482212 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:04.482158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:04.482365 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.482342 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:04.482365 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.482361 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:04.482485 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.482373 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:04.482485 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.482429 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:06.482411494 +0000 UTC m=+6.218625312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:04.853274 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:04.853233 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:04.853274 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:04.853379 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:04.892285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:04.892152 2574 generic.go:358] "Generic (PLEG): container finished" podID="d5d09bbd1af6f808e94311449d7cd444" containerID="1789242497cef12c5b1820d2a27791ee8f0a275903a1a0f34c49728d38396f3f" exitCode=0 Apr 17 17:25:04.892839 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:04.892609 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerDied","Data":"1789242497cef12c5b1820d2a27791ee8f0a275903a1a0f34c49728d38396f3f"} Apr 17 17:25:05.853851 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:05.853819 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:05.854320 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:05.853947 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:05.905062 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:05.905020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" event={"ID":"d5d09bbd1af6f808e94311449d7cd444","Type":"ContainerStarted","Data":"6eb0dacadf7e5f1ee2f3aad7884ae2d750ed31170ace266ecb6717fe03fed27b"} Apr 17 17:25:06.401461 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:06.401419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:06.401655 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.401583 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:06.401716 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.401656 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:10.401637309 +0000 UTC m=+10.137851135 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:06.502071 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:06.502026 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:06.502254 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.502213 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:06.502254 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.502241 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:06.502254 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.502254 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:06.502415 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.502311 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:10.502293322 +0000 UTC m=+10.238507139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:06.853822 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:06.853780 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:06.853989 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:06.853917 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:07.853362 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.853298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:07.853545 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:07.853433 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:07.879018 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.878964 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-127.ec2.internal" podStartSLOduration=6.878944989 podStartE2EDuration="6.878944989s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:05.91931344 +0000 UTC m=+5.655527274" watchObservedRunningTime="2026-04-17 17:25:07.878944989 +0000 UTC m=+7.615158822" Apr 17 17:25:07.879502 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.879387 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ftpnl"] Apr 17 17:25:07.882340 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.882319 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:07.882431 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:07.882392 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:07.915314 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.915279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:07.915441 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.915360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-kubelet-config\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:07.915441 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:07.915392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-dbus\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.016377 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.016297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-kubelet-config\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.016377 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.016350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-dbus\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.016623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.016402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.016623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.016443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-kubelet-config\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.016623 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:08.016534 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.016623 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:08.016605 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:08.516586961 +0000 UTC m=+8.252800780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.016623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.016607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-dbus\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.521250 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.521131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:08.521416 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:08.521275 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.521416 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:08.521360 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:09.521340117 +0000 UTC m=+9.257553934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:08.853913 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:08.853833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:08.854061 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:08.853970 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:09.529043 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:09.528981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:09.529599 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:09.529199 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:09.529599 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:09.529265 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:11.529247554 +0000 UTC m=+11.265461372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:25:09.853505 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:09.853419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:09.853664 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:09.853419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:09.853664 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:09.853563 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:09.853664 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:09.853625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:10.436093 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:10.435935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:10.436297 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.436118 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:10.436297 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.436203 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:18.436183813 +0000 UTC m=+18.172397641 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:10.537114 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:10.536450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:10.537114 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.536680 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:10.537114 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.536700 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:10.537114 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.536714 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:10.537114 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.536772 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:18.536754263 +0000 UTC m=+18.272968091 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:10.855213 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:10.854719 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:10.855213 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:10.854840 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:11.544326 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:11.544230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:11.544734 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:11.544354 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:11.544734 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:11.544436 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:15.544416055 +0000 UTC m=+15.280629890 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:11.853745 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:11.853671 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:11.853895 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:11.853672 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:11.853895 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:11.853812 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:11.853895 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:11.853882 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:12.853141 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:12.853110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:12.853555 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:12.853242 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:13.853842 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:13.853800 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:13.854305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:13.853852 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:13.854305 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:13.853998 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:13.854758 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:13.854508 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:14.853186 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:14.853143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:14.853348 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:14.853293 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:15.574885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:15.574856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:15.575235 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:15.574995 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:15.575235 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:15.575048 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:23.575034812 +0000 UTC m=+23.311248629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:15.853948 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:15.853854 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:15.854102 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:15.853865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:15.854102 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:15.853993 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:15.854102 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:15.854059 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:16.853835 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:16.853806 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:16.854235 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:16.853938 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:17.853610 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:17.853575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:17.853854 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:17.853576 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:17.853854 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:17.853705 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:17.853854 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:17.853805 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:18.494743 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:18.494699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:18.494909 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.494865 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:18.494962 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.494942 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:34.494924709 +0000 UTC m=+34.231138525 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:18.596050 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:18.596003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:18.596236 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.596159 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:25:18.596236 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.596190 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:25:18.596236 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.596204 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:18.596375 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.596255 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:34.596241872 +0000 UTC m=+34.332455684 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:25:18.853448 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:18.853408 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:18.853618 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:18.853523 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:19.853804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:19.853769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:19.853804 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:19.853802 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:19.854187 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:19.853876 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:19.854187 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:19.853913 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:20.854323 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.854143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:20.854981 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:20.854387 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:20.930103 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.930065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d75sg" event={"ID":"02498340-44b9-4152-9802-82fbeecce918","Type":"ContainerStarted","Data":"2d7358bc60c8781810b576bb26f946265a17974bb4513b3d21ef1d4b60e9f6d2"}
Apr 17 17:25:20.931524 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.931492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4dfq5" event={"ID":"d31b25a9-8351-4624-8ef6-a1389bdd2474","Type":"ContainerStarted","Data":"461430bb63f91bc3e08b51e37178173e127445bbaf20a18ab17cac84f1fa84cd"}
Apr 17 17:25:20.932803 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.932768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" event={"ID":"c1a26ef0-b655-4041-9be3-b2b3b545a29f","Type":"ContainerStarted","Data":"a538ff585017244bdaf6bcc5482646102fc3a338afd680d9b0b3a1bd62ddddc8"}
Apr 17 17:25:20.934214 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.934188 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5b72d" event={"ID":"807ce854-eb81-42f4-8fb8-0060d033ffbf","Type":"ContainerStarted","Data":"764ef57ac4e6f6649c63ccd54c625b33b34de67e9a29620b17bdd5abc726d774"}
Apr 17 17:25:20.935655 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.935631 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="51c7130b4f4bff4ffd4f36b2ffd44ab11dfe632b51b32f3d171a2f619eb218dc" exitCode=0
Apr 17 17:25:20.935757 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.935701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"51c7130b4f4bff4ffd4f36b2ffd44ab11dfe632b51b32f3d171a2f619eb218dc"}
Apr 17 17:25:20.937137 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.937104 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5wf77" event={"ID":"bdfc917a-4e35-4bac-8c08-84c70e29539e","Type":"ContainerStarted","Data":"8749cbdbf1c26863554a480b6acd66499c9b217041fa6a0cc225e35586338016"}
Apr 17 17:25:20.938873 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.938844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nrxzd" event={"ID":"2a664808-7e4f-495a-bd8e-3278a11bb604","Type":"ContainerStarted","Data":"991d958e7840753bc9bbae3f54e4df836fa16a57daf43db4ecfece369637845e"}
Apr 17 17:25:20.941583 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"000ead36226e274b55f88abe0560170f5dcdead9f63ef582d105393bfd4aa68c"}
Apr 17 17:25:20.941648 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941583 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"99f13fd2bd8a9482c3da8f8514f55d389f5848b258070f9f4033308d43e36d17"}
Apr 17 17:25:20.941648 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"7c356dffa76aaf251ff4fa14ff4ff5cf418b861f2eac56cdf7458914903b38a7"}
Apr 17 17:25:20.941648 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"c043fa5ab617722b8b4c986623c91d19840b78ac56537feb048aa55639595de7"}
Apr 17 17:25:20.941648 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"0932f69d8c9b129622f08ff1229336d8de308974829f8b21b637bee77e1c1ef3"}
Apr 17 17:25:20.941648 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.941636 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"673d2b772ae257b2eac6dde862b0b8adb5455ba08da0d0655cc5991b2ebf7026"}
Apr 17 17:25:20.988483 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.988397 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d75sg" podStartSLOduration=3.461867773 podStartE2EDuration="19.988382849s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.678772385 +0000 UTC m=+3.414986201" lastFinishedPulling="2026-04-17 17:25:20.205287459 +0000 UTC m=+19.941501277" observedRunningTime="2026-04-17 17:25:20.960320259 +0000 UTC m=+20.696534104" watchObservedRunningTime="2026-04-17 17:25:20.988382849 +0000 UTC m=+20.724596683"
Apr 17 17:25:20.988634 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:20.988532 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5b72d" podStartSLOduration=3.116496597 podStartE2EDuration="19.988524885s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.670576304 +0000 UTC m=+3.406790119" lastFinishedPulling="2026-04-17 17:25:20.542604595 +0000 UTC m=+20.278818407" observedRunningTime="2026-04-17 17:25:20.988479281 +0000 UTC m=+20.724693117" watchObservedRunningTime="2026-04-17 17:25:20.988524885 +0000 UTC m=+20.724738720"
Apr 17 17:25:21.038675 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.038617 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nrxzd" podStartSLOduration=8.04586901 podStartE2EDuration="20.038598127s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.680832308 +0000 UTC m=+3.417046127" lastFinishedPulling="2026-04-17 17:25:15.673561431 +0000 UTC m=+15.409775244" observedRunningTime="2026-04-17 17:25:21.013633318 +0000 UTC m=+20.749847153" watchObservedRunningTime="2026-04-17 17:25:21.038598127 +0000 UTC m=+20.774811945"
Apr 17 17:25:21.064462 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.064420 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5wf77" podStartSLOduration=3.541335046 podStartE2EDuration="20.064407045s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.682230726 +0000 UTC m=+3.418444538" lastFinishedPulling="2026-04-17 17:25:20.205302722 +0000 UTC m=+19.941516537" observedRunningTime="2026-04-17 17:25:21.063935059 +0000 UTC m=+20.800148915" watchObservedRunningTime="2026-04-17 17:25:21.064407045 +0000 UTC m=+20.800620877"
Apr 17 17:25:21.085133 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.083752 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4dfq5" podStartSLOduration=3.882094512 podStartE2EDuration="20.083736277s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.677198123 +0000 UTC m=+3.413411938" lastFinishedPulling="2026-04-17 17:25:19.878839879 +0000 UTC m=+19.615053703" observedRunningTime="2026-04-17 17:25:21.082716731 +0000 UTC m=+20.818930564" watchObservedRunningTime="2026-04-17 17:25:21.083736277 +0000 UTC m=+20.819950114"
Apr 17 17:25:21.379193 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.379036 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:25:21.821857 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.821762 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:25:21.379185413Z","UUID":"11c9541f-3b29-46cd-b2cf-136daab75c96","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:25:21.824537 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.824515 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 17:25:21.824537 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.824557 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 17:25:21.852967 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.852916 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:21.852967 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.852941 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:21.853155 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:21.853021 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:21.853314 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:21.853280 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:21.946205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.946152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" event={"ID":"c1a26ef0-b655-4041-9be3-b2b3b545a29f","Type":"ContainerStarted","Data":"6fa5f2f2164c61cee0a69ac42da232bf8ee81558d37b436ee55c129741cd65d4"}
Apr 17 17:25:21.947903 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.947873 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wt5ct" event={"ID":"01868bf7-f6d2-461d-8bf1-006126117f62","Type":"ContainerStarted","Data":"354faf8f4f0f3fbda4dcd6269c0f33504ec7269905377924227f898f8240c9ac"}
Apr 17 17:25:21.962277 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:21.962226 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wt5ct" podStartSLOduration=5.431802035 podStartE2EDuration="21.962210803s" podCreationTimestamp="2026-04-17 17:25:00 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.674909639 +0000 UTC m=+3.411123457" lastFinishedPulling="2026-04-17 17:25:20.205318398 +0000 UTC m=+19.941532225" observedRunningTime="2026-04-17 17:25:21.961527615 +0000 UTC m=+21.697741820" watchObservedRunningTime="2026-04-17 17:25:21.962210803 +0000 UTC m=+21.698424639"
Apr 17 17:25:22.853753 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:22.853718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:22.853908 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:22.853847 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:22.951899 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:22.951869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"0dc3342ded9419a3245eedf46821932401e8bf12f80f0a6e7d782e4cdad97325"}
Apr 17 17:25:22.953741 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:22.953714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" event={"ID":"c1a26ef0-b655-4041-9be3-b2b3b545a29f","Type":"ContainerStarted","Data":"fbcd7b4afa1aaf53337255e3eb88e86a389438f826519878da19bf113e66e088"}
Apr 17 17:25:22.971306 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:22.971266 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ld9sz" podStartSLOduration=3.394833169 podStartE2EDuration="21.971252872s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.676704178 +0000 UTC m=+3.412918002" lastFinishedPulling="2026-04-17 17:25:22.253123881 +0000 UTC m=+21.989337705" observedRunningTime="2026-04-17 17:25:22.971065744 +0000 UTC m=+22.707279578" watchObservedRunningTime="2026-04-17 17:25:22.971252872 +0000 UTC m=+22.707466703"
Apr 17 17:25:23.221932 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.221842 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nrxzd"
Apr 17 17:25:23.222482 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.222464 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nrxzd"
Apr 17 17:25:23.634478 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.634440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:23.634657 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:23.634599 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:23.634714 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:23.634673 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret podName:342d0e0b-38a9-4fb2-a76e-aa5459a12a9e nodeName:}" failed. No retries permitted until 2026-04-17 17:25:39.634658035 +0000 UTC m=+39.370871987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret") pod "global-pull-secret-syncer-ftpnl" (UID: "342d0e0b-38a9-4fb2-a76e-aa5459a12a9e") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:25:23.853158 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.853123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:23.853158 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.853163 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:23.853383 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:23.853281 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:23.853435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:23.853392 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e"
Apr 17 17:25:23.956600 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.956505 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nrxzd"
Apr 17 17:25:23.957218 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:23.957006 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nrxzd"
Apr 17 17:25:24.853459 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:24.853417 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:24.853636 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:24.853559 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:25:25.853464 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.853282 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:25:25.854052 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.853282 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl"
Apr 17 17:25:25.854052 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:25.853552 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f"
Apr 17 17:25:25.854052 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:25.853594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:25.962324 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.962287 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" event={"ID":"110a4c18-b7af-4bb1-8f5e-f332eb485ccb","Type":"ContainerStarted","Data":"5bcb95a26ea4dc6977cc2466074bd40316929ccae09e68abe85636ce9dd4952c"} Apr 17 17:25:25.962546 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.962527 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:25.963995 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.963971 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="6057b50df2aae509db3424134759b552fb7d838f10431381bf009f05048cf721" exitCode=0 Apr 17 17:25:25.964113 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.963998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"6057b50df2aae509db3424134759b552fb7d838f10431381bf009f05048cf721"} Apr 17 17:25:25.977368 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:25.977346 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:26.001799 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.001758 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" podStartSLOduration=8.398445168 podStartE2EDuration="25.001744997s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.679675724 +0000 UTC m=+3.415889539" lastFinishedPulling="2026-04-17 17:25:20.282975539 +0000 UTC m=+20.019189368" observedRunningTime="2026-04-17 
17:25:25.992765137 +0000 UTC m=+25.728978981" watchObservedRunningTime="2026-04-17 17:25:26.001744997 +0000 UTC m=+25.737958831" Apr 17 17:25:26.853900 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.853871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:26.854406 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:26.853988 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:26.968489 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.968149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerStarted","Data":"f4a4718d17e1602d117358823ec6016db1f60e5a94d22e0ff3164f10f477b5f4"} Apr 17 17:25:26.968489 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.968313 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:25:26.969053 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.968924 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:26.987729 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:26.987702 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:27.225078 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.225003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ftpnl"] Apr 17 17:25:27.225222 ip-10-0-135-127 kubenswrapper[2574]: I0417 
17:25:27.225126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:27.225268 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:27.225250 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:27.230075 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.230040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cqtr2"] Apr 17 17:25:27.230214 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.230142 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:27.230275 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:27.230256 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:27.230967 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.230947 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vvm4h"] Apr 17 17:25:27.231063 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.231025 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:27.231107 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:27.231090 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:27.971801 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.971767 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="f4a4718d17e1602d117358823ec6016db1f60e5a94d22e0ff3164f10f477b5f4" exitCode=0 Apr 17 17:25:27.972190 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.971819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"f4a4718d17e1602d117358823ec6016db1f60e5a94d22e0ff3164f10f477b5f4"} Apr 17 17:25:27.972190 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:27.972051 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:25:28.853333 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.853304 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:28.853503 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.853308 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:28.853503 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:28.853405 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:28.853610 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:28.853496 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:28.853610 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.853555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:28.853698 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:28.853615 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:28.975696 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.975596 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="6912d3c5e766573d44443ea92ce3de46a37e093ed7cbd9533632d22eb52dfdb5" exitCode=0 Apr 17 17:25:28.975696 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.975678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"6912d3c5e766573d44443ea92ce3de46a37e093ed7cbd9533632d22eb52dfdb5"} Apr 17 17:25:28.976080 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:28.975941 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:25:30.854100 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:30.854067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:30.854921 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:30.854156 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:30.854921 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:30.854214 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:30.854921 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:30.854219 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:30.854921 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:30.854292 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:30.854921 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:30.854406 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:31.759779 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:31.759742 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:31.760009 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:31.759993 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 17:25:31.774772 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:31.774747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sgzx2" Apr 17 17:25:32.853559 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:32.853528 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:32.854009 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:32.853528 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:32.854009 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:32.853650 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ftpnl" podUID="342d0e0b-38a9-4fb2-a76e-aa5459a12a9e" Apr 17 17:25:32.854009 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:32.853528 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:32.854009 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:32.853714 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86" Apr 17 17:25:32.854009 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:32.853823 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vvm4h" podUID="4b83a9e4-5073-4105-bc72-4980376e169f" Apr 17 17:25:33.113188 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.113095 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-127.ec2.internal" event="NodeReady" Apr 17 17:25:33.113329 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.113252 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:25:33.150569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.150529 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"] Apr 17 17:25:33.183352 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.182902 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"] Apr 17 17:25:33.183352 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.183234 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" Apr 17 17:25:33.186068 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.186042 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:25:33.186218 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.186042 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-zc8pm\"" Apr 17 17:25:33.186477 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.186456 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:25:33.186595 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.186463 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:25:33.186595 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.186574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 17:25:33.207656 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.207632 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"] Apr 17 17:25:33.207813 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.207793 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" Apr 17 17:25:33.210410 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.210387 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 17:25:33.228993 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.228972 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"] Apr 17 17:25:33.229120 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.229074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" Apr 17 17:25:33.231810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.231753 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:25:33.231899 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.231832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hlc25\"" Apr 17 17:25:33.232050 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.232034 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:25:33.232186 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.232078 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:25:33.248545 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248520 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"] Apr 17 17:25:33.248667 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248551 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"] Apr 17 17:25:33.248667 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248568 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"] Apr 17 17:25:33.248667 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248579 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"] Apr 17 17:25:33.248667 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248593 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-g8tgx"] Apr 17 17:25:33.248978 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.248958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" Apr 17 17:25:33.250622 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.250605 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:25:33.251674 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.251656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 17:25:33.251674 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.251669 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 17:25:33.251808 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.251797 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 17:25:33.251863 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.251806 
2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 17:25:33.269402 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.269307 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9mzwz"] Apr 17 17:25:33.269529 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.269431 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g8tgx" Apr 17 17:25:33.272722 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.272700 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:25:33.272828 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.272727 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:25:33.272828 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.272755 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\"" Apr 17 17:25:33.284669 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.284643 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9mzwz"] Apr 17 17:25:33.284774 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.284670 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mzwz" Apr 17 17:25:33.284774 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.284689 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g8tgx"] Apr 17 17:25:33.287360 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.287333 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:25:33.287460 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.287363 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:25:33.287460 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.287444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\"" Apr 17 17:25:33.287460 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.287456 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:25:33.305531 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.305506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" Apr 17 17:25:33.305640 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.305550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cc90e67-acf5-4042-8e41-ae5f86e4249d-tmp\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.305640 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.305579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nsht\" (UniqueName: \"kubernetes.io/projected/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-kube-api-access-7nsht\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.305640 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.305607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3cc90e67-acf5-4042-8e41-ae5f86e4249d-klusterlet-config\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.305737 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.305704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzgtn\" (UniqueName: \"kubernetes.io/projected/3cc90e67-acf5-4042-8e41-ae5f86e4249d-kube-api-access-qzgtn\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.407054 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.406965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407054 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407054 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs2hc\" (UniqueName: \"kubernetes.io/projected/a5a0550e-4a4c-4a4b-841e-64468d8467ce-kube-api-access-bs2hc\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbhnk\" (UniqueName: \"kubernetes.io/projected/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-kube-api-access-hbhnk\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rphf\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-tmp-dir\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzgtn\" (UniqueName: \"kubernetes.io/projected/3cc90e67-acf5-4042-8e41-ae5f86e4249d-kube-api-access-qzgtn\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.407336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-config-volume\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407512 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407689 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407730 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvvk\" (UniqueName: \"kubernetes.io/projected/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-kube-api-access-dpvvk\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cc90e67-acf5-4042-8e41-ae5f86e4249d-tmp\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.407810 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nsht\" (UniqueName: \"kubernetes.io/projected/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-kube-api-access-7nsht\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.408263 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.407905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3cc90e67-acf5-4042-8e41-ae5f86e4249d-klusterlet-config\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.408263 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.408137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3cc90e67-acf5-4042-8e41-ae5f86e4249d-tmp\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.413232 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.413082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.413327 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.413082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3cc90e67-acf5-4042-8e41-ae5f86e4249d-klusterlet-config\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.417291 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.417263 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nsht\" (UniqueName: \"kubernetes.io/projected/c3ed62d4-67a6-4ebf-939c-d3a20b609d91-kube-api-access-7nsht\") pod \"managed-serviceaccount-addon-agent-846b6797f7-bmc2p\" (UID: \"c3ed62d4-67a6-4ebf-939c-d3a20b609d91\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.417465 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.417415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzgtn\" (UniqueName: \"kubernetes.io/projected/3cc90e67-acf5-4042-8e41-ae5f86e4249d-kube-api-access-qzgtn\") pod \"klusterlet-addon-workmgr-b75f6f878-d98w6\" (UID: \"3cc90e67-acf5-4042-8e41-ae5f86e4249d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.503274 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.503225 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"
Apr 17 17:25:33.508214 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs2hc\" (UniqueName: \"kubernetes.io/projected/a5a0550e-4a4c-4a4b-841e-64468d8467ce-kube-api-access-bs2hc\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:33.508422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbhnk\" (UniqueName: \"kubernetes.io/projected/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-kube-api-access-hbhnk\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.508422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.508422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508422 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rphf\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-tmp-dir\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-config-volume\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.508653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.508976 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508755 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508976 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.508976 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.508976 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.508976 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.508967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvvk\" (UniqueName: \"kubernetes.io/projected/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-kube-api-access-dpvvk\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.509297 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.509297 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-tmp-dir\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.509297 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509147 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:33.509297 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509162 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found
Apr 17 17:25:33.509477 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509419 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:33.509477 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.509477 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509472 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:34.009453968 +0000 UTC m=+33.745667786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:33.509623 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.509623 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509521 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:34.009511925 +0000 UTC m=+33.745725754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found
Apr 17 17:25:33.509623 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509589 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:33.509623 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:33.509620 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:25:34.009610164 +0000 UTC m=+33.745823982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found
Apr 17 17:25:33.509869 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.509955 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.509898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-config-volume\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.511570 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.511551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.511663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.511553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-ca\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.511663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.511609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.512118 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.512098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.512291 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.512273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.512642 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.512624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.517445 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.517385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"
Apr 17 17:25:33.520837 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.520787 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs2hc\" (UniqueName: \"kubernetes.io/projected/a5a0550e-4a4c-4a4b-841e-64468d8467ce-kube-api-access-bs2hc\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:33.522629 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.522608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvvk\" (UniqueName: \"kubernetes.io/projected/d6aaf1d4-521f-41d7-95d8-679d0d7827e2-kube-api-access-dpvvk\") pod \"cluster-proxy-proxy-agent-858ff96b46-l8t4p\" (UID: \"d6aaf1d4-521f-41d7-95d8-679d0d7827e2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:33.523152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.523098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rphf\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.523152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.523114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:33.523328 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.523309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbhnk\" (UniqueName: \"kubernetes.io/projected/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-kube-api-access-hbhnk\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:33.558633 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:33.558604 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:25:34.012927 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.012883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.012986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013043 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.013052 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:35.013097059 +0000 UTC m=+34.749310871 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013136 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013146 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013158 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013210 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:25:35.013192372 +0000 UTC m=+34.749406190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found
Apr 17 17:25:34.013355 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.013231 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:35.01322091 +0000 UTC m=+34.749434724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found
Apr 17 17:25:34.516657 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.516625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:25:34.516813 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.516757 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:25:34.516878 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.516839 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.516820524 +0000 UTC m=+66.253034355 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:25:34.617645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.617594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:34.617834 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.617788 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:25:34.617834 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.617818 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:25:34.617834 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.617837 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5s2nn for pod openshift-network-diagnostics/network-check-target-vvm4h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:34.617980 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:34.617900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn podName:4b83a9e4-5073-4105-bc72-4980376e169f nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:06.617882433 +0000 UTC m=+66.354096246 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5s2nn" (UniqueName: "kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn") pod "network-check-target-vvm4h" (UID: "4b83a9e4-5073-4105-bc72-4980376e169f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:25:34.853674 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.853610 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2" Apr 17 17:25:34.853799 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.853610 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:34.853799 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.853610 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h" Apr 17 17:25:34.856406 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.856383 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:25:34.856509 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.856407 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:25:34.857553 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.857535 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:25:34.857649 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.857550 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\"" Apr 17 17:25:34.857649 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.857577 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:25:34.857649 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:34.857532 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\"" Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.024116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz" Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.024295 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 
17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.025316 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.025294286 +0000 UTC m=+36.761508113 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.025397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.025430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx" Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.025545 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:35.025663 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.025590 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:37.025577752 +0000 UTC m=+36.761791564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found Apr 17 17:25:35.026922 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.026837 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:35.026922 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.026856 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found Apr 17 17:25:35.026922 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:35.026900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:37.026886025 +0000 UTC m=+36.763099837 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found Apr 17 17:25:35.112221 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.112143 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p"] Apr 17 17:25:35.115708 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.115685 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"] Apr 17 17:25:35.123776 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.123754 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6"] Apr 17 17:25:35.207560 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:35.207525 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ed62d4_67a6_4ebf_939c_d3a20b609d91.slice/crio-f62d83c50c795e1e022ea1db79750f204a238766d78f1ca2074dabbfb6f85572 WatchSource:0}: Error finding container f62d83c50c795e1e022ea1db79750f204a238766d78f1ca2074dabbfb6f85572: Status 404 returned error can't find the container with id f62d83c50c795e1e022ea1db79750f204a238766d78f1ca2074dabbfb6f85572 Apr 17 17:25:35.208342 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:35.208311 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6aaf1d4_521f_41d7_95d8_679d0d7827e2.slice/crio-35d26dd800e869d557a1a6007950bd5238fcbcdd645650fcc615a2dfed81b90a WatchSource:0}: Error finding container 35d26dd800e869d557a1a6007950bd5238fcbcdd645650fcc615a2dfed81b90a: Status 404 returned error can't find the container with 
id 35d26dd800e869d557a1a6007950bd5238fcbcdd645650fcc615a2dfed81b90a Apr 17 17:25:35.209017 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:35.208993 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc90e67_acf5_4042_8e41_ae5f86e4249d.slice/crio-647bb578e47f8624a3e28ddbbdc220006196c3f2b5f7fe587401f41356ac49f4 WatchSource:0}: Error finding container 647bb578e47f8624a3e28ddbbdc220006196c3f2b5f7fe587401f41356ac49f4: Status 404 returned error can't find the container with id 647bb578e47f8624a3e28ddbbdc220006196c3f2b5f7fe587401f41356ac49f4 Apr 17 17:25:35.995866 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.995737 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="63f6048c1d299ce2376f6ec30fb3c5e2a65cdc80f52823f0a0028f77f1d4df8c" exitCode=0 Apr 17 17:25:35.995866 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.995822 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"63f6048c1d299ce2376f6ec30fb3c5e2a65cdc80f52823f0a0028f77f1d4df8c"} Apr 17 17:25:35.998061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:35.998034 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerStarted","Data":"35d26dd800e869d557a1a6007950bd5238fcbcdd645650fcc615a2dfed81b90a"} Apr 17 17:25:36.000619 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:36.000543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" 
event={"ID":"c3ed62d4-67a6-4ebf-939c-d3a20b609d91","Type":"ContainerStarted","Data":"f62d83c50c795e1e022ea1db79750f204a238766d78f1ca2074dabbfb6f85572"} Apr 17 17:25:36.002822 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:36.002785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" event={"ID":"3cc90e67-acf5-4042-8e41-ae5f86e4249d","Type":"ContainerStarted","Data":"647bb578e47f8624a3e28ddbbdc220006196c3f2b5f7fe587401f41356ac49f4"} Apr 17 17:25:37.012653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:37.011993 2574 generic.go:358] "Generic (PLEG): container finished" podID="474e9a38-21a3-415a-a945-80417640d569" containerID="bb32fd29e927e58093de9918785619f944a3b0a656142712dc34b1f937e299d7" exitCode=0 Apr 17 17:25:37.012653 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:37.012338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerDied","Data":"bb32fd29e927e58093de9918785619f944a3b0a656142712dc34b1f937e299d7"} Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:37.045737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:37.045786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx" Apr 17 17:25:37.046435 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:25:37.045869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz" Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.045981 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.046015 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.046030 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.046039 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.046021644 +0000 UTC m=+40.782235479 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.046073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:41.046063441 +0000 UTC m=+40.782277265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.045981 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:37.046435 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:37.046104 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:41.046095676 +0000 UTC m=+40.782309500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found Apr 17 17:25:39.666022 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:39.665976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: \"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:39.670249 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:39.670223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/342d0e0b-38a9-4fb2-a76e-aa5459a12a9e-original-pull-secret\") pod \"global-pull-secret-syncer-ftpnl\" (UID: 
\"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e\") " pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:39.968157 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:39.968062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ftpnl" Apr 17 17:25:41.074753 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:41.074711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz" Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:41.074801 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:41.074830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx" Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074872 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074902 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074918 2574 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074922 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074937 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:25:49.074922943 +0000 UTC m=+48.811136778 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074970 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:49.074956851 +0000 UTC m=+48.811170664 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found Apr 17 17:25:41.075130 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:41.074988 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:49.07497589 +0000 UTC m=+48.811189710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found Apr 17 17:25:41.224758 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:41.224735 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ftpnl"] Apr 17 17:25:41.226299 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:25:41.226277 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342d0e0b_38a9_4fb2_a76e_aa5459a12a9e.slice/crio-279218370c70bfac4ac4095ef3ca5c089c74d5b7db63a70561c16e36eb062acb WatchSource:0}: Error finding container 279218370c70bfac4ac4095ef3ca5c089c74d5b7db63a70561c16e36eb062acb: Status 404 returned error can't find the container with id 279218370c70bfac4ac4095ef3ca5c089c74d5b7db63a70561c16e36eb062acb Apr 17 17:25:42.023972 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.023935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ftpnl" event={"ID":"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e","Type":"ContainerStarted","Data":"279218370c70bfac4ac4095ef3ca5c089c74d5b7db63a70561c16e36eb062acb"} Apr 17 17:25:42.028237 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.028206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cf28x" event={"ID":"474e9a38-21a3-415a-a945-80417640d569","Type":"ContainerStarted","Data":"c30679ed4b4baaf581bdb107db4600e5e1ef53ab8faa40f2691c8e13bcf99526"} Apr 17 17:25:42.029853 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.029816 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerStarted","Data":"cbdf454b2f59ddf815652259c0ca87f0e87c8f34374c124daddd73abeca3bb51"} Apr 17 17:25:42.031380 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.031359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" event={"ID":"c3ed62d4-67a6-4ebf-939c-d3a20b609d91","Type":"ContainerStarted","Data":"226bc49b629767e9845f743025783a69639934d4574935145b38317667fa7a78"} Apr 17 17:25:42.033141 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.033088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" event={"ID":"3cc90e67-acf5-4042-8e41-ae5f86e4249d","Type":"ContainerStarted","Data":"4710e283344af990331a660ded52ef43d5c57c8ff366afa58d39ab0f92fa8cd7"} Apr 17 17:25:42.033389 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.033355 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" Apr 17 17:25:42.035313 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.035290 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" Apr 17 17:25:42.055210 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.055144 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cf28x" podStartSLOduration=9.48216254 podStartE2EDuration="41.055127801s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:25:03.665823781 +0000 UTC m=+3.402037597" lastFinishedPulling="2026-04-17 17:25:35.238789033 +0000 UTC m=+34.975002858" observedRunningTime="2026-04-17 
17:25:42.05237111 +0000 UTC m=+41.788584945" watchObservedRunningTime="2026-04-17 17:25:42.055127801 +0000 UTC m=+41.791341639" Apr 17 17:25:42.072424 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.072385 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" podStartSLOduration=6.170933369 podStartE2EDuration="12.072369464s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:35.215120947 +0000 UTC m=+34.951334759" lastFinishedPulling="2026-04-17 17:25:41.116557026 +0000 UTC m=+40.852770854" observedRunningTime="2026-04-17 17:25:42.071679074 +0000 UTC m=+41.807892910" watchObservedRunningTime="2026-04-17 17:25:42.072369464 +0000 UTC m=+41.808583300" Apr 17 17:25:42.088796 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:42.088759 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" podStartSLOduration=6.203061169 podStartE2EDuration="12.088747404s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:35.215417294 +0000 UTC m=+34.951631121" lastFinishedPulling="2026-04-17 17:25:41.10110354 +0000 UTC m=+40.837317356" observedRunningTime="2026-04-17 17:25:42.087075589 +0000 UTC m=+41.823289423" watchObservedRunningTime="2026-04-17 17:25:42.088747404 +0000 UTC m=+41.824961238" Apr 17 17:25:46.044352 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:46.044252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ftpnl" event={"ID":"342d0e0b-38a9-4fb2-a76e-aa5459a12a9e","Type":"ContainerStarted","Data":"f35c58678b9c0e2259511c72dec0cae87363dc6b0395c4959f935d9625d5afc6"} Apr 17 17:25:46.046043 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:46.046014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerStarted","Data":"1414c5bbd04ba3f8d7ea13743434c30795ffb9c4823607b8e11f6c01d11148b5"}
Apr 17 17:25:46.046043 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:46.046044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerStarted","Data":"c45d4d6da2d4adfe30057ec285c961647a5445eda930ab5ba38c46ed089e3720"}
Apr 17 17:25:46.060256 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:46.060215 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ftpnl" podStartSLOduration=34.532872763 podStartE2EDuration="39.060203463s" podCreationTimestamp="2026-04-17 17:25:07 +0000 UTC" firstStartedPulling="2026-04-17 17:25:41.22795065 +0000 UTC m=+40.964164465" lastFinishedPulling="2026-04-17 17:25:45.755281353 +0000 UTC m=+45.491495165" observedRunningTime="2026-04-17 17:25:46.059979921 +0000 UTC m=+45.796193755" watchObservedRunningTime="2026-04-17 17:25:46.060203463 +0000 UTC m=+45.796417297"
Apr 17 17:25:46.077525 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:46.077483 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" podStartSLOduration=5.9237837859999996 podStartE2EDuration="16.077472254s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:35.215298597 +0000 UTC m=+34.951512423" lastFinishedPulling="2026-04-17 17:25:45.368987064 +0000 UTC m=+45.105200891" observedRunningTime="2026-04-17 17:25:46.076948315 +0000 UTC m=+45.813162170" watchObservedRunningTime="2026-04-17 17:25:46.077472254 +0000 UTC m=+45.813686087"
Apr 17 17:25:49.136306 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:49.136269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:49.136328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:25:49.136348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136414 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136425 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136482 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:05.136469898 +0000 UTC m=+64.872683711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136486 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136502 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136549 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:26:05.136536069 +0000 UTC m=+64.872749886 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found
Apr 17 17:25:49.136684 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:25:49.136571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:05.136563955 +0000 UTC m=+64.872777767 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found
Apr 17 17:26:05.146425 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:05.146382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:05.146455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:05.146483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146538 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146596 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146613 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146618 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146606 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:26:37.146590465 +0000 UTC m=+96.882804278 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146674 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:37.146660482 +0000 UTC m=+96.882874294 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found
Apr 17 17:26:05.146998 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:05.146690 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:37.146684694 +0000 UTC m=+96.882898507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found
Apr 17 17:26:06.556956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.556915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:26:06.560070 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.560052 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:26:06.567613 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:06.567596 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:26:06.567659 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:06.567649 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:10.567633939 +0000 UTC m=+130.303847751 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : secret "metrics-daemon-secret" not found
Apr 17 17:26:06.657853 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.657818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:26:06.660789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.660768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:26:06.671410 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.671391 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:26:06.681702 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.681677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2nn\" (UniqueName: \"kubernetes.io/projected/4b83a9e4-5073-4105-bc72-4980376e169f-kube-api-access-5s2nn\") pod \"network-check-target-vvm4h\" (UID: \"4b83a9e4-5073-4105-bc72-4980376e169f\") " pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:26:06.975588 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.975513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wl5wm\""
Apr 17 17:26:06.982848 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:06.982823 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:26:07.106605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:07.106578 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vvm4h"]
Apr 17 17:26:07.109949 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:26:07.109925 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b83a9e4_5073_4105_bc72_4980376e169f.slice/crio-baaac5f460dee7e2f49edfaadeaa2138a9374e1bf9d11a334b9597d41108d2ab WatchSource:0}: Error finding container baaac5f460dee7e2f49edfaadeaa2138a9374e1bf9d11a334b9597d41108d2ab: Status 404 returned error can't find the container with id baaac5f460dee7e2f49edfaadeaa2138a9374e1bf9d11a334b9597d41108d2ab
Apr 17 17:26:08.099947 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:08.099911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vvm4h" event={"ID":"4b83a9e4-5073-4105-bc72-4980376e169f","Type":"ContainerStarted","Data":"baaac5f460dee7e2f49edfaadeaa2138a9374e1bf9d11a334b9597d41108d2ab"}
Apr 17 17:26:11.110722 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:11.110687 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vvm4h" event={"ID":"4b83a9e4-5073-4105-bc72-4980376e169f","Type":"ContainerStarted","Data":"a838565c4499d4e4f3630d4d4e13e39f95d68f6f454ef887e60fdeca1a995c6d"}
Apr 17 17:26:11.111212 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:11.110846 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:26:11.126609 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:11.126546 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vvm4h" podStartSLOduration=67.893878898 podStartE2EDuration="1m11.126534454s" podCreationTimestamp="2026-04-17 17:25:00 +0000 UTC" firstStartedPulling="2026-04-17 17:26:07.111645188 +0000 UTC m=+66.847859001" lastFinishedPulling="2026-04-17 17:26:10.344300744 +0000 UTC m=+70.080514557" observedRunningTime="2026-04-17 17:26:11.126064663 +0000 UTC m=+70.862278497" watchObservedRunningTime="2026-04-17 17:26:11.126534454 +0000 UTC m=+70.862748278"
Apr 17 17:26:37.183012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:37.182913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:26:37.183012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:37.182987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:37.183017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183062 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183127 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183143 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fc76bc7f-f8tx7: secret "image-registry-tls" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183132 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert podName:a5a0550e-4a4c-4a4b-841e-64468d8467ce nodeName:}" failed. No retries permitted until 2026-04-17 17:27:41.183112514 +0000 UTC m=+160.919326327 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert") pod "ingress-canary-9mzwz" (UID: "a5a0550e-4a4c-4a4b-841e-64468d8467ce") : secret "canary-serving-cert" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183202 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183223 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls podName:c886c27b-9257-4fdc-b97a-21a2d21fe963 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:41.183205352 +0000 UTC m=+160.919419165 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls") pod "image-registry-fc76bc7f-f8tx7" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963") : secret "image-registry-tls" not found
Apr 17 17:26:37.183470 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:26:37.183250 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls podName:cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:41.183236997 +0000 UTC m=+160.919450821 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls") pod "dns-default-g8tgx" (UID: "cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01") : secret "dns-default-metrics-tls" not found
Apr 17 17:26:42.115051 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:26:42.115013 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vvm4h"
Apr 17 17:27:06.476561 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:06.476527 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dfq5_d31b25a9-8351-4624-8ef6-a1389bdd2474/dns-node-resolver/0.log"
Apr 17 17:27:07.476205 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:07.476164 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d75sg_02498340-44b9-4152-9802-82fbeecce918/node-ca/0.log"
Apr 17 17:27:10.627401 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:10.627362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:27:10.627768 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:10.627515 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:27:10.627768 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:10.627588 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs podName:f3033f4c-b4a1-45de-8f08-0fbf65425c86 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:12.627573493 +0000 UTC m=+252.363787305 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs") pod "network-metrics-daemon-cqtr2" (UID: "f3033f4c-b4a1-45de-8f08-0fbf65425c86") : secret "metrics-daemon-secret" not found
Apr 17 17:27:36.238347 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:36.238306 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963"
Apr 17 17:27:36.296420 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.296391 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:27:36.297786 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:36.297764 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-g8tgx" podUID="cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01"
Apr 17 17:27:36.308854 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:36.308836 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9mzwz" podUID="a5a0550e-4a4c-4a4b-841e-64468d8467ce"
Apr 17 17:27:36.710160 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.710127 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cr9jd"]
Apr 17 17:27:36.713429 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.713393 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.718807 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.718785 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 17:27:36.718807 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.718796 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-68nj2\""
Apr 17 17:27:36.718971 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.718825 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:27:36.718971 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.718786 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:27:36.718971 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.718919 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 17:27:36.724932 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.724901 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cr9jd"]
Apr 17 17:27:36.815806 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.815781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737470dd-5a5c-4575-9059-1060d3ebbea6-crio-socket\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.815931 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.815828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737470dd-5a5c-4575-9059-1060d3ebbea6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.815931 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.815861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/737470dd-5a5c-4575-9059-1060d3ebbea6-data-volume\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.816014 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.815922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.816014 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.815957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh95\" (UniqueName: \"kubernetes.io/projected/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-api-access-cdh95\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916620 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737470dd-5a5c-4575-9059-1060d3ebbea6-crio-socket\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737470dd-5a5c-4575-9059-1060d3ebbea6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/737470dd-5a5c-4575-9059-1060d3ebbea6-data-volume\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh95\" (UniqueName: \"kubernetes.io/projected/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-api-access-cdh95\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.916789 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.916712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/737470dd-5a5c-4575-9059-1060d3ebbea6-crio-socket\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.917073 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.917052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/737470dd-5a5c-4575-9059-1060d3ebbea6-data-volume\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.917283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.917267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.918943 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.918924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/737470dd-5a5c-4575-9059-1060d3ebbea6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:36.960409 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:36.960344 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh95\" (UniqueName: \"kubernetes.io/projected/737470dd-5a5c-4575-9059-1060d3ebbea6-kube-api-access-cdh95\") pod \"insights-runtime-extractor-cr9jd\" (UID: \"737470dd-5a5c-4575-9059-1060d3ebbea6\") " pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:37.024224 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.024161 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cr9jd"
Apr 17 17:27:37.137504 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.137471 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cr9jd"]
Apr 17 17:27:37.140539 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:27:37.140511 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod737470dd_5a5c_4575_9059_1060d3ebbea6.slice/crio-a7be8b9765fe834ac3f416b056110195beda043fed01f03e8b134ee1b3169585 WatchSource:0}: Error finding container a7be8b9765fe834ac3f416b056110195beda043fed01f03e8b134ee1b3169585: Status 404 returned error can't find the container with id a7be8b9765fe834ac3f416b056110195beda043fed01f03e8b134ee1b3169585
Apr 17 17:27:37.299672 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.299638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr9jd" event={"ID":"737470dd-5a5c-4575-9059-1060d3ebbea6","Type":"ContainerStarted","Data":"3210bd7c953351cec50ccb4c87e9d56f881387b4722e6f07d5e5b159f02594b0"}
Apr 17 17:27:37.299672 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.299657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:27:37.299672 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.299674 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr9jd" event={"ID":"737470dd-5a5c-4575-9059-1060d3ebbea6","Type":"ContainerStarted","Data":"a7be8b9765fe834ac3f416b056110195beda043fed01f03e8b134ee1b3169585"}
Apr 17 17:27:37.300197 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:37.299830 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:27:37.863440 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:37.863413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cqtr2" podUID="f3033f4c-b4a1-45de-8f08-0fbf65425c86"
Apr 17 17:27:38.302987 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:38.302952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr9jd" event={"ID":"737470dd-5a5c-4575-9059-1060d3ebbea6","Type":"ContainerStarted","Data":"6d11e4c452aa6dd4f7aa057b00ebec49d06b13ec18453c3cb49fc25087f25155"}
Apr 17 17:27:40.309435 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:40.309405 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cr9jd" event={"ID":"737470dd-5a5c-4575-9059-1060d3ebbea6","Type":"ContainerStarted","Data":"fb57afe5c875056c1d88e4acab130ebc2c0fc89269f866bbbd948318ab2e3d68"}
Apr 17 17:27:40.328033 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:40.327991 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-cr9jd" podStartSLOduration=2.110498449 podStartE2EDuration="4.32797918s" podCreationTimestamp="2026-04-17 17:27:36 +0000 UTC" firstStartedPulling="2026-04-17 17:27:37.194514115 +0000 UTC m=+156.930727933" lastFinishedPulling="2026-04-17 17:27:39.411994851 +0000 UTC m=+159.148208664" observedRunningTime="2026-04-17 17:27:40.327485182 +0000 UTC m=+160.063699018" watchObservedRunningTime="2026-04-17 17:27:40.32797918 +0000 UTC m=+160.064193043"
Apr 17 17:27:41.252911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.252871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:27:41.253104 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.252924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:27:41.253104 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.252949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:27:41.255256 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.255221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01-metrics-tls\") pod \"dns-default-g8tgx\" (UID: \"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01\") " pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:27:41.255364 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.255303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a5a0550e-4a4c-4a4b-841e-64468d8467ce-cert\") pod \"ingress-canary-9mzwz\" (UID: \"a5a0550e-4a4c-4a4b-841e-64468d8467ce\") " pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:27:41.255364 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.255317 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"image-registry-fc76bc7f-f8tx7\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") " pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:27:41.399592 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.399568 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hlc25\""
Apr 17 17:27:41.407972 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.407954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:27:41.503188 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.503104 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fhhs9\""
Apr 17 17:27:41.503312 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.503160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w4qb9\""
Apr 17 17:27:41.511229 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.511207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mzwz"
Apr 17 17:27:41.511229 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.511218 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-g8tgx" Apr 17 17:27:41.519906 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.519887 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"] Apr 17 17:27:41.522144 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:27:41.522117 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc886c27b_9257_4fdc_b97a_21a2d21fe963.slice/crio-ff1e529f24417597e0ee0fb4cb3e4517dd21eae6fb81f4278968e4307b13f196 WatchSource:0}: Error finding container ff1e529f24417597e0ee0fb4cb3e4517dd21eae6fb81f4278968e4307b13f196: Status 404 returned error can't find the container with id ff1e529f24417597e0ee0fb4cb3e4517dd21eae6fb81f4278968e4307b13f196 Apr 17 17:27:41.634153 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.633999 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9mzwz"] Apr 17 17:27:41.636578 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:27:41.636540 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a0550e_4a4c_4a4b_841e_64468d8467ce.slice/crio-9c268ebdd61f7ea6e0390931aec107172b0628221bc3a78003961ed32eff44ea WatchSource:0}: Error finding container 9c268ebdd61f7ea6e0390931aec107172b0628221bc3a78003961ed32eff44ea: Status 404 returned error can't find the container with id 9c268ebdd61f7ea6e0390931aec107172b0628221bc3a78003961ed32eff44ea Apr 17 17:27:41.650954 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:41.650930 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-g8tgx"] Apr 17 17:27:41.653591 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:27:41.653565 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcceba8d9_fe83_48a9_9faf_5ce2fdf1dc01.slice/crio-3f5e1f5255e3c4adc9f3602a041e1c7472a02cdb8444b0e5455e23ab082abe75 WatchSource:0}: Error finding container 3f5e1f5255e3c4adc9f3602a041e1c7472a02cdb8444b0e5455e23ab082abe75: Status 404 returned error can't find the container with id 3f5e1f5255e3c4adc9f3602a041e1c7472a02cdb8444b0e5455e23ab082abe75 Apr 17 17:27:42.034699 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.034637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" podUID="3cc90e67-acf5-4042-8e41-ae5f86e4249d" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused" Apr 17 17:27:42.316197 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.316083 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9mzwz" event={"ID":"a5a0550e-4a4c-4a4b-841e-64468d8467ce","Type":"ContainerStarted","Data":"9c268ebdd61f7ea6e0390931aec107172b0628221bc3a78003961ed32eff44ea"} Apr 17 17:27:42.317576 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.317542 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8tgx" event={"ID":"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01","Type":"ContainerStarted","Data":"3f5e1f5255e3c4adc9f3602a041e1c7472a02cdb8444b0e5455e23ab082abe75"} Apr 17 17:27:42.319179 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.319147 2574 generic.go:358] "Generic (PLEG): container finished" podID="c3ed62d4-67a6-4ebf-939c-d3a20b609d91" containerID="226bc49b629767e9845f743025783a69639934d4574935145b38317667fa7a78" exitCode=255 Apr 17 17:27:42.319308 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.319199 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" event={"ID":"c3ed62d4-67a6-4ebf-939c-d3a20b609d91","Type":"ContainerDied","Data":"226bc49b629767e9845f743025783a69639934d4574935145b38317667fa7a78"} Apr 17 17:27:42.319563 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.319545 2574 scope.go:117] "RemoveContainer" containerID="226bc49b629767e9845f743025783a69639934d4574935145b38317667fa7a78" Apr 17 17:27:42.320775 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.320753 2574 generic.go:358] "Generic (PLEG): container finished" podID="3cc90e67-acf5-4042-8e41-ae5f86e4249d" containerID="4710e283344af990331a660ded52ef43d5c57c8ff366afa58d39ab0f92fa8cd7" exitCode=1 Apr 17 17:27:42.320849 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.320815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" event={"ID":"3cc90e67-acf5-4042-8e41-ae5f86e4249d","Type":"ContainerDied","Data":"4710e283344af990331a660ded52ef43d5c57c8ff366afa58d39ab0f92fa8cd7"} Apr 17 17:27:42.321116 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.321098 2574 scope.go:117] "RemoveContainer" containerID="4710e283344af990331a660ded52ef43d5c57c8ff366afa58d39ab0f92fa8cd7" Apr 17 17:27:42.323584 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.323556 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" event={"ID":"c886c27b-9257-4fdc-b97a-21a2d21fe963","Type":"ContainerStarted","Data":"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"} Apr 17 17:27:42.323689 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.323589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" event={"ID":"c886c27b-9257-4fdc-b97a-21a2d21fe963","Type":"ContainerStarted","Data":"ff1e529f24417597e0ee0fb4cb3e4517dd21eae6fb81f4278968e4307b13f196"} Apr 17 
17:27:42.323811 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.323749 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" Apr 17 17:27:42.358285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:42.358232 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" podStartSLOduration=161.358217103 podStartE2EDuration="2m41.358217103s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:27:42.357502009 +0000 UTC m=+162.093715864" watchObservedRunningTime="2026-04-17 17:27:42.358217103 +0000 UTC m=+162.094430937" Apr 17 17:27:43.327646 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:43.327608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-846b6797f7-bmc2p" event={"ID":"c3ed62d4-67a6-4ebf-939c-d3a20b609d91","Type":"ContainerStarted","Data":"09b5471875d2d07cdc84ea8cc2d52aa8793bbb478287efd634be3aaece9f8d73"} Apr 17 17:27:43.329482 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:43.329451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" event={"ID":"3cc90e67-acf5-4042-8e41-ae5f86e4249d","Type":"ContainerStarted","Data":"61783721575e05cf4868c31aa7bc5d22512db8925510f79457308433c03ae2e9"} Apr 17 17:27:43.329794 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:43.329748 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" Apr 17 17:27:43.330370 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:43.330347 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-b75f6f878-d98w6" Apr 17 17:27:44.332788 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:44.332745 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9mzwz" event={"ID":"a5a0550e-4a4c-4a4b-841e-64468d8467ce","Type":"ContainerStarted","Data":"1773a5536ad575fe6b3c17626f5c22dfee661d89f4a161dae893b9df6553b378"} Apr 17 17:27:44.334242 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:44.334220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8tgx" event={"ID":"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01","Type":"ContainerStarted","Data":"a0fd5514d72135a8bb043fa7b960a92d79f43b1730d40cdbfd5da408c937c92d"} Apr 17 17:27:44.334342 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:44.334249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-g8tgx" event={"ID":"cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01","Type":"ContainerStarted","Data":"2ff45a711395aa91ddd9a76ea67bb1ee426052d139e0a15e05332b7a95159a50"} Apr 17 17:27:44.334342 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:44.334318 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-g8tgx" Apr 17 17:27:44.349122 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:44.349077 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9mzwz" podStartSLOduration=129.375459235 podStartE2EDuration="2m11.349066245s" podCreationTimestamp="2026-04-17 17:25:33 +0000 UTC" firstStartedPulling="2026-04-17 17:27:41.639078454 +0000 UTC m=+161.375292267" lastFinishedPulling="2026-04-17 17:27:43.61268546 +0000 UTC m=+163.348899277" observedRunningTime="2026-04-17 17:27:44.348296212 +0000 UTC m=+164.084510048" watchObservedRunningTime="2026-04-17 17:27:44.349066245 +0000 UTC m=+164.085280070" Apr 17 17:27:45.075978 ip-10-0-135-127 kubenswrapper[2574]: I0417 
17:27:45.075927 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-g8tgx" podStartSLOduration=130.121885897 podStartE2EDuration="2m12.075909883s" podCreationTimestamp="2026-04-17 17:25:33 +0000 UTC" firstStartedPulling="2026-04-17 17:27:41.65513365 +0000 UTC m=+161.391347462" lastFinishedPulling="2026-04-17 17:27:43.609157636 +0000 UTC m=+163.345371448" observedRunningTime="2026-04-17 17:27:44.365782915 +0000 UTC m=+164.101996750" watchObservedRunningTime="2026-04-17 17:27:45.075909883 +0000 UTC m=+164.812123717" Apr 17 17:27:45.076744 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.076725 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9k9j9"] Apr 17 17:27:45.079844 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.079815 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.082274 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.082255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:27:45.082463 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.082445 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:27:45.082521 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.082495 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:27:45.083682 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.083665 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:27:45.083682 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.083676 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4qm8w\"" Apr 17 17:27:45.083792 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.083674 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:27:45.083792 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.083678 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:27:45.185625 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9de9badb-14ff-4855-9c22-842335059617-node-exporter-textfile\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185625 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zntqh\" (UniqueName: \"kubernetes.io/projected/9de9badb-14ff-4855-9c22-842335059617-kube-api-access-zntqh\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185676 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-root\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-sys\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-node-exporter-wtmp\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " 
pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.185814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.185810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-metrics-client-ca\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287116 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9de9badb-14ff-4855-9c22-842335059617-node-exporter-textfile\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zntqh\" (UniqueName: \"kubernetes.io/projected/9de9badb-14ff-4855-9c22-842335059617-kube-api-access-zntqh\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-root\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287283 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-sys\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-node-exporter-wtmp\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:45.287338 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: 
secret "node-exporter-tls" not found Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-sys\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287371 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-metrics-client-ca\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:45.287402 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls podName:9de9badb-14ff-4855-9c22-842335059617 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:45.787383708 +0000 UTC m=+165.523597522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls") pod "node-exporter-9k9j9" (UID: "9de9badb-14ff-4855-9c22-842335059617") : secret "node-exporter-tls" not found Apr 17 17:27:45.287467 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-root\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287695 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9de9badb-14ff-4855-9c22-842335059617-node-exporter-wtmp\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287695 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9de9badb-14ff-4855-9c22-842335059617-node-exporter-textfile\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.287903 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-node-exporter-accelerators-collector-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.288006 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.287914 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9de9badb-14ff-4855-9c22-842335059617-metrics-client-ca\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.289558 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.289532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.299016 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.298996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zntqh\" (UniqueName: \"kubernetes.io/projected/9de9badb-14ff-4855-9c22-842335059617-kube-api-access-zntqh\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.790721 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:45.790690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:45.791062 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:45.790810 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:27:45.791062 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:27:45.790860 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls 
podName:9de9badb-14ff-4855-9c22-842335059617 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:46.790847069 +0000 UTC m=+166.527060881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls") pod "node-exporter-9k9j9" (UID: "9de9badb-14ff-4855-9c22-842335059617") : secret "node-exporter-tls" not found Apr 17 17:27:46.796008 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:46.795974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:46.798199 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:46.798158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9de9badb-14ff-4855-9c22-842335059617-node-exporter-tls\") pod \"node-exporter-9k9j9\" (UID: \"9de9badb-14ff-4855-9c22-842335059617\") " pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:46.888453 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:46.888427 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9k9j9" Apr 17 17:27:46.898311 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:27:46.898287 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de9badb_14ff_4855_9c22_842335059617.slice/crio-921b37712fc2006a2f0133a8d5964ed0f34650aaedd16d4771214a3728d49fbb WatchSource:0}: Error finding container 921b37712fc2006a2f0133a8d5964ed0f34650aaedd16d4771214a3728d49fbb: Status 404 returned error can't find the container with id 921b37712fc2006a2f0133a8d5964ed0f34650aaedd16d4771214a3728d49fbb Apr 17 17:27:47.343114 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:47.343078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k9j9" event={"ID":"9de9badb-14ff-4855-9c22-842335059617","Type":"ContainerStarted","Data":"921b37712fc2006a2f0133a8d5964ed0f34650aaedd16d4771214a3728d49fbb"} Apr 17 17:27:48.346943 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:48.346911 2574 generic.go:358] "Generic (PLEG): container finished" podID="9de9badb-14ff-4855-9c22-842335059617" containerID="aa064c8cc1dcf3b19223cdf13d2ef8c9897e40342e7dce24ee510e3ed11ac6f2" exitCode=0 Apr 17 17:27:48.347312 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:48.346982 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k9j9" event={"ID":"9de9badb-14ff-4855-9c22-842335059617","Type":"ContainerDied","Data":"aa064c8cc1dcf3b19223cdf13d2ef8c9897e40342e7dce24ee510e3ed11ac6f2"} Apr 17 17:27:48.853930 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:48.853896 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:27:49.351068 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:49.351033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k9j9" event={"ID":"9de9badb-14ff-4855-9c22-842335059617","Type":"ContainerStarted","Data":"a41283ec7108288002f92a103199fce3af145a2e08a00d91abf70022e9ddf0c9"}
Apr 17 17:27:49.351068 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:49.351072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9k9j9" event={"ID":"9de9badb-14ff-4855-9c22-842335059617","Type":"ContainerStarted","Data":"5dca0a30ca2c51a27aaf9f194a38649f630d017ca0f9c0e1bea3fd696079a151"}
Apr 17 17:27:49.372958 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:49.372910 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9k9j9" podStartSLOduration=3.5819313470000003 podStartE2EDuration="4.372898358s" podCreationTimestamp="2026-04-17 17:27:45 +0000 UTC" firstStartedPulling="2026-04-17 17:27:46.900475469 +0000 UTC m=+166.636689286" lastFinishedPulling="2026-04-17 17:27:47.691442483 +0000 UTC m=+167.427656297" observedRunningTime="2026-04-17 17:27:49.371060062 +0000 UTC m=+169.107273895" watchObservedRunningTime="2026-04-17 17:27:49.372898358 +0000 UTC m=+169.109112192"
Apr 17 17:27:54.339370 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:54.339336 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-g8tgx"
Apr 17 17:27:58.463558 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:58.463525 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"]
Apr 17 17:27:58.467461 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:58.467436 2574 patch_prober.go:28] interesting pod/image-registry-fc76bc7f-f8tx7 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 17:27:58.467579 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:27:58.467477 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:28:08.468392 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:08.468363 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:28:23.481380 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.481319 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerName="registry" containerID="cri-o://7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa" gracePeriod=30
Apr 17 17:28:23.712988 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.712968 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:28:23.876618 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876581 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876618 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876625 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876667 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876690 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876707 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rphf\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876723 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876739 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.876885 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.876761 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted\") pod \"c886c27b-9257-4fdc-b97a-21a2d21fe963\" (UID: \"c886c27b-9257-4fdc-b97a-21a2d21fe963\") "
Apr 17 17:28:23.877196 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.877081 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:23.877259 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.877161 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:28:23.879346 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.879288 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:23.879470 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.879381 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:23.879470 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.879384 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:23.879581 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.879547 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:28:23.879645 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.879620 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf" (OuterVolumeSpecName: "kube-api-access-5rphf") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "kube-api-access-5rphf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:28:23.885514 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.885492 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c886c27b-9257-4fdc-b97a-21a2d21fe963" (UID: "c886c27b-9257-4fdc-b97a-21a2d21fe963"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:28:23.977231 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977198 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rphf\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-kube-api-access-5rphf\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977231 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977227 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977231 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977237 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-installation-pull-secrets\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977436 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977246 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c886c27b-9257-4fdc-b97a-21a2d21fe963-ca-trust-extracted\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977436 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977256 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-registry-certificates\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977436 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977265 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c886c27b-9257-4fdc-b97a-21a2d21fe963-image-registry-private-configuration\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977436 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977274 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c886c27b-9257-4fdc-b97a-21a2d21fe963-trusted-ca\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:23.977436 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:23.977283 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c886c27b-9257-4fdc-b97a-21a2d21fe963-bound-sa-token\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:28:24.438434 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.438403 2574 generic.go:358] "Generic (PLEG): container finished" podID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerID="7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa" exitCode=0
Apr 17 17:28:24.438605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.438455 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7"
Apr 17 17:28:24.438605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.438456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" event={"ID":"c886c27b-9257-4fdc-b97a-21a2d21fe963","Type":"ContainerDied","Data":"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"}
Apr 17 17:28:24.438605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.438573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fc76bc7f-f8tx7" event={"ID":"c886c27b-9257-4fdc-b97a-21a2d21fe963","Type":"ContainerDied","Data":"ff1e529f24417597e0ee0fb4cb3e4517dd21eae6fb81f4278968e4307b13f196"}
Apr 17 17:28:24.438605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.438596 2574 scope.go:117] "RemoveContainer" containerID="7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"
Apr 17 17:28:24.445986 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.445966 2574 scope.go:117] "RemoveContainer" containerID="7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"
Apr 17 17:28:24.446395 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:28:24.446365 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa\": container with ID starting with 7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa not found: ID does not exist" containerID="7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"
Apr 17 17:28:24.446480 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.446400 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa"} err="failed to get container status \"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa\": rpc error: code = NotFound desc = could not find container \"7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa\": container with ID starting with 7827bc577dc72aaee992bcb8d231a8d8fcaecaa2c8401b7ea7cd1e53b64321aa not found: ID does not exist"
Apr 17 17:28:24.459524 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.459499 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"]
Apr 17 17:28:24.463376 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.463356 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-fc76bc7f-f8tx7"]
Apr 17 17:28:24.856606 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:24.856576 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" path="/var/lib/kubelet/pods/c886c27b-9257-4fdc-b97a-21a2d21fe963/volumes"
Apr 17 17:28:33.559698 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:33.559659 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" podUID="d6aaf1d4-521f-41d7-95d8-679d0d7827e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 17:28:43.559975 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:43.559936 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" podUID="d6aaf1d4-521f-41d7-95d8-679d0d7827e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 17:28:53.559473 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:53.559437 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" podUID="d6aaf1d4-521f-41d7-95d8-679d0d7827e2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 17:28:53.559826 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:53.559508 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p"
Apr 17 17:28:53.559938 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:53.559919 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"1414c5bbd04ba3f8d7ea13743434c30795ffb9c4823607b8e11f6c01d11148b5"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 17 17:28:53.559974 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:53.559959 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" podUID="d6aaf1d4-521f-41d7-95d8-679d0d7827e2" containerName="service-proxy" containerID="cri-o://1414c5bbd04ba3f8d7ea13743434c30795ffb9c4823607b8e11f6c01d11148b5" gracePeriod=30
Apr 17 17:28:54.517598 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:54.517568 2574 generic.go:358] "Generic (PLEG): container finished" podID="d6aaf1d4-521f-41d7-95d8-679d0d7827e2" containerID="1414c5bbd04ba3f8d7ea13743434c30795ffb9c4823607b8e11f6c01d11148b5" exitCode=2
Apr 17 17:28:54.517843 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:54.517625 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerDied","Data":"1414c5bbd04ba3f8d7ea13743434c30795ffb9c4823607b8e11f6c01d11148b5"}
Apr 17 17:28:54.517843 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:28:54.517649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-858ff96b46-l8t4p" event={"ID":"d6aaf1d4-521f-41d7-95d8-679d0d7827e2","Type":"ContainerStarted","Data":"d7889fb3ffb81191eafafae8eec555cc635096116f683a39699efd1d832cb4bb"}
Apr 17 17:29:12.629287 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:12.629239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:29:12.631606 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:12.631581 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f3033f4c-b4a1-45de-8f08-0fbf65425c86-metrics-certs\") pod \"network-metrics-daemon-cqtr2\" (UID: \"f3033f4c-b4a1-45de-8f08-0fbf65425c86\") " pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:29:12.857821 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:12.857676 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lkqvr\""
Apr 17 17:29:12.865178 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:12.865144 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cqtr2"
Apr 17 17:29:12.977348 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:12.977323 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cqtr2"]
Apr 17 17:29:12.978932 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:29:12.978907 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3033f4c_b4a1_45de_8f08_0fbf65425c86.slice/crio-7f614d936c6252931c3c468758c36ca67424bba1b6eef0b783d9206ec7fecd67 WatchSource:0}: Error finding container 7f614d936c6252931c3c468758c36ca67424bba1b6eef0b783d9206ec7fecd67: Status 404 returned error can't find the container with id 7f614d936c6252931c3c468758c36ca67424bba1b6eef0b783d9206ec7fecd67
Apr 17 17:29:13.563397 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:13.563366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqtr2" event={"ID":"f3033f4c-b4a1-45de-8f08-0fbf65425c86","Type":"ContainerStarted","Data":"7f614d936c6252931c3c468758c36ca67424bba1b6eef0b783d9206ec7fecd67"}
Apr 17 17:29:14.567608 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:14.567571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqtr2" event={"ID":"f3033f4c-b4a1-45de-8f08-0fbf65425c86","Type":"ContainerStarted","Data":"7a91e4fa878bc2f6c7dab6c793f34cddd1c376781a7c5edd3abab3969313e6a7"}
Apr 17 17:29:14.567608 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:14.567612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cqtr2" event={"ID":"f3033f4c-b4a1-45de-8f08-0fbf65425c86","Type":"ContainerStarted","Data":"79e0f5236cb4d5669294807d57864ac8d3057106b39d7e8ca403e8a17a5cee13"}
Apr 17 17:29:14.590561 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:29:14.590511 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cqtr2" podStartSLOduration=252.449347959 podStartE2EDuration="4m13.590497466s" podCreationTimestamp="2026-04-17 17:25:01 +0000 UTC" firstStartedPulling="2026-04-17 17:29:12.980822726 +0000 UTC m=+252.717036542" lastFinishedPulling="2026-04-17 17:29:14.121972237 +0000 UTC m=+253.858186049" observedRunningTime="2026-04-17 17:29:14.589581926 +0000 UTC m=+254.325795760" watchObservedRunningTime="2026-04-17 17:29:14.590497466 +0000 UTC m=+254.326711298"
Apr 17 17:30:00.767955 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:30:00.767926 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:32:23.209844 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.209766 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"]
Apr 17 17:32:23.210296 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.210013 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerName="registry"
Apr 17 17:32:23.210296 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.210024 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerName="registry"
Apr 17 17:32:23.210296 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.210072 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c886c27b-9257-4fdc-b97a-21a2d21fe963" containerName="registry"
Apr 17 17:32:23.212766 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.212749 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.215435 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.215413 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 17 17:32:23.215547 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.215458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 17:32:23.220430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.216842 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-p7mt4\""
Apr 17 17:32:23.220430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.216988 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 17 17:32:23.220430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.217100 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 17:32:23.220430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.217274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 17:32:23.224649 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.224589 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"]
Apr 17 17:32:23.299066 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.299027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/759d9b6c-8c7b-4b47-9b54-f8052983704a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.299250 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.299077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.299250 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.299094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqc6b\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-kube-api-access-hqc6b\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.399721 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.399682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/759d9b6c-8c7b-4b47-9b54-f8052983704a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.399827 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.399738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.399827 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.399758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqc6b\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-kube-api-access-hqc6b\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.399903 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.399836 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 17 17:32:23.399903 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.399859 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 17:32:23.399903 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.399877 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z: references non-existent secret key: tls.crt
Apr 17 17:32:23.400005 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.399929 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates podName:759d9b6c-8c7b-4b47-9b54-f8052983704a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:23.89991424 +0000 UTC m=+443.636128052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates") pod "keda-metrics-apiserver-7c9f485588-5dx7z" (UID: "759d9b6c-8c7b-4b47-9b54-f8052983704a") : references non-existent secret key: tls.crt
Apr 17 17:32:23.400059 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.400026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/759d9b6c-8c7b-4b47-9b54-f8052983704a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.413197 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.413157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqc6b\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-kube-api-access-hqc6b\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.519489 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.519455 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xbsrn"]
Apr 17 17:32:23.522447 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.522430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.525012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.524992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 17 17:32:23.530870 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.530846 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xbsrn"]
Apr 17 17:32:23.601190 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.601132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f8rt\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-kube-api-access-2f8rt\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.601190 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.601191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.701598 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.701562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2f8rt\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-kube-api-access-2f8rt\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.701788 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.701603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.701788 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.701743 2574 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 17 17:32:23.701788 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.701769 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xbsrn: secret "keda-admission-webhooks-certs" not found
Apr 17 17:32:23.701945 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.701832 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates podName:06de0cf4-5c31-4ee4-8aa0-19170c4069b3 nodeName:}" failed. No retries permitted until 2026-04-17 17:32:24.201812166 +0000 UTC m=+443.938025984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates") pod "keda-admission-cf49989db-xbsrn" (UID: "06de0cf4-5c31-4ee4-8aa0-19170c4069b3") : secret "keda-admission-webhooks-certs" not found
Apr 17 17:32:23.710642 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.710609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f8rt\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-kube-api-access-2f8rt\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:23.903965 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:23.903869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"
Apr 17 17:32:23.904092 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.904003 2574 secret.go:281] references non-existent secret key: tls.crt
Apr 17 17:32:23.904092 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.904023 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 17 17:32:23.904092 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.904040 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z: references non-existent secret key: tls.crt
Apr 17 17:32:23.904092 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:23.904089 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates podName:759d9b6c-8c7b-4b47-9b54-f8052983704a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:24.90407668 +0000 UTC m=+444.640290493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates") pod "keda-metrics-apiserver-7c9f485588-5dx7z" (UID: "759d9b6c-8c7b-4b47-9b54-f8052983704a") : references non-existent secret key: tls.crt
Apr 17 17:32:24.206923 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.206848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:24.209101 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.209081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/06de0cf4-5c31-4ee4-8aa0-19170c4069b3-certificates\") pod \"keda-admission-cf49989db-xbsrn\" (UID: \"06de0cf4-5c31-4ee4-8aa0-19170c4069b3\") " pod="openshift-keda/keda-admission-cf49989db-xbsrn"
Apr 17 17:32:24.433139 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.433094 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xbsrn" Apr 17 17:32:24.551697 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.551666 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xbsrn"] Apr 17 17:32:24.557348 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:32:24.557312 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06de0cf4_5c31_4ee4_8aa0_19170c4069b3.slice/crio-a9fd40733623b99a26a60821b1ab94a7af94faf0c2d7ac63a67ac1778895cd87 WatchSource:0}: Error finding container a9fd40733623b99a26a60821b1ab94a7af94faf0c2d7ac63a67ac1778895cd87: Status 404 returned error can't find the container with id a9fd40733623b99a26a60821b1ab94a7af94faf0c2d7ac63a67ac1778895cd87 Apr 17 17:32:24.558072 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.558054 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:32:24.911289 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:24.911252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:24.911456 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:24.911398 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:32:24.911456 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:24.911412 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:32:24.911456 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:24.911428 2574 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z: references non-existent secret key: tls.crt Apr 17 17:32:24.911552 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:24.911481 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates podName:759d9b6c-8c7b-4b47-9b54-f8052983704a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:26.911465452 +0000 UTC m=+446.647679280 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates") pod "keda-metrics-apiserver-7c9f485588-5dx7z" (UID: "759d9b6c-8c7b-4b47-9b54-f8052983704a") : references non-existent secret key: tls.crt Apr 17 17:32:25.017575 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:25.017541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xbsrn" event={"ID":"06de0cf4-5c31-4ee4-8aa0-19170c4069b3","Type":"ContainerStarted","Data":"a9fd40733623b99a26a60821b1ab94a7af94faf0c2d7ac63a67ac1778895cd87"} Apr 17 17:32:26.926264 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:26.926223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:26.926720 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:26.926393 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 17:32:26.926720 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:26.926417 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 17:32:26.926720 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:26.926442 2574 
projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z: references non-existent secret key: tls.crt Apr 17 17:32:26.926720 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:32:26.926509 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates podName:759d9b6c-8c7b-4b47-9b54-f8052983704a nodeName:}" failed. No retries permitted until 2026-04-17 17:32:30.926489605 +0000 UTC m=+450.662703437 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates") pod "keda-metrics-apiserver-7c9f485588-5dx7z" (UID: "759d9b6c-8c7b-4b47-9b54-f8052983704a") : references non-existent secret key: tls.crt Apr 17 17:32:28.026448 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:28.026410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xbsrn" event={"ID":"06de0cf4-5c31-4ee4-8aa0-19170c4069b3","Type":"ContainerStarted","Data":"9152c7bd935c7c88c59ca5c119d62956d14a4da75f9ad4137639b926d50bf1cf"} Apr 17 17:32:28.026835 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:28.026527 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xbsrn" Apr 17 17:32:28.043646 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:28.043599 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xbsrn" podStartSLOduration=1.76509266 podStartE2EDuration="5.043585074s" podCreationTimestamp="2026-04-17 17:32:23 +0000 UTC" firstStartedPulling="2026-04-17 17:32:24.558235099 +0000 UTC m=+444.294448912" lastFinishedPulling="2026-04-17 17:32:27.836727501 +0000 UTC m=+447.572941326" observedRunningTime="2026-04-17 17:32:28.042841737 +0000 UTC m=+447.779055569" 
watchObservedRunningTime="2026-04-17 17:32:28.043585074 +0000 UTC m=+447.779798911" Apr 17 17:32:30.957953 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:30.957914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:30.960482 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:30.960458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/759d9b6c-8c7b-4b47-9b54-f8052983704a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-5dx7z\" (UID: \"759d9b6c-8c7b-4b47-9b54-f8052983704a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:31.027394 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:31.027344 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:31.138735 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:31.138708 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z"] Apr 17 17:32:31.141281 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:32:31.141250 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod759d9b6c_8c7b_4b47_9b54_f8052983704a.slice/crio-105d57719c84345757b23fd5a53b5eae8c43ce680a690568e49fa3c7190128a7 WatchSource:0}: Error finding container 105d57719c84345757b23fd5a53b5eae8c43ce680a690568e49fa3c7190128a7: Status 404 returned error can't find the container with id 105d57719c84345757b23fd5a53b5eae8c43ce680a690568e49fa3c7190128a7 Apr 17 17:32:32.037184 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:32.037126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" event={"ID":"759d9b6c-8c7b-4b47-9b54-f8052983704a","Type":"ContainerStarted","Data":"105d57719c84345757b23fd5a53b5eae8c43ce680a690568e49fa3c7190128a7"} Apr 17 17:32:34.043529 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:34.043491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" event={"ID":"759d9b6c-8c7b-4b47-9b54-f8052983704a","Type":"ContainerStarted","Data":"eb58a94ed6883d9deb78a93c15a6324ff68c15eaa14abe02994fd6f7f2d62e55"} Apr 17 17:32:34.043982 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:34.043641 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:34.062270 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:34.062221 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" 
podStartSLOduration=8.337830797 podStartE2EDuration="11.062207203s" podCreationTimestamp="2026-04-17 17:32:23 +0000 UTC" firstStartedPulling="2026-04-17 17:32:31.142666114 +0000 UTC m=+450.878879941" lastFinishedPulling="2026-04-17 17:32:33.867042532 +0000 UTC m=+453.603256347" observedRunningTime="2026-04-17 17:32:34.06153566 +0000 UTC m=+453.797749497" watchObservedRunningTime="2026-04-17 17:32:34.062207203 +0000 UTC m=+453.798421248" Apr 17 17:32:45.050208 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:45.050155 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-5dx7z" Apr 17 17:32:49.031083 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:32:49.031047 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xbsrn" Apr 17 17:33:29.214547 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.214505 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-w4vx9"] Apr 17 17:33:29.217430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.217408 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.220764 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.220746 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 17:33:29.221030 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.221011 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:33:29.221139 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.221017 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:33:29.221880 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.221867 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-g5wcj\"" Apr 17 17:33:29.226125 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.226103 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-w4vx9"] Apr 17 17:33:29.259381 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.259348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/545651ab-36d6-443f-8b27-1c4c851d22ce-data\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.259536 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.259386 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmpng\" (UniqueName: \"kubernetes.io/projected/545651ab-36d6-443f-8b27-1c4c851d22ce-kube-api-access-gmpng\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.360149 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.360093 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/545651ab-36d6-443f-8b27-1c4c851d22ce-data\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.360149 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.360152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmpng\" (UniqueName: \"kubernetes.io/projected/545651ab-36d6-443f-8b27-1c4c851d22ce-kube-api-access-gmpng\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.360569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.360548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/545651ab-36d6-443f-8b27-1c4c851d22ce-data\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.374730 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.374705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmpng\" (UniqueName: \"kubernetes.io/projected/545651ab-36d6-443f-8b27-1c4c851d22ce-kube-api-access-gmpng\") pod \"seaweedfs-86cc847c5c-w4vx9\" (UID: \"545651ab-36d6-443f-8b27-1c4c851d22ce\") " pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.526870 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.526836 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:29.645927 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:29.645896 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-w4vx9"] Apr 17 17:33:29.648492 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:33:29.648465 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545651ab_36d6_443f_8b27_1c4c851d22ce.slice/crio-4857f99fa5a9bde8541961e61c859083e7334d5974b72dd36983e59dfb2ed591 WatchSource:0}: Error finding container 4857f99fa5a9bde8541961e61c859083e7334d5974b72dd36983e59dfb2ed591: Status 404 returned error can't find the container with id 4857f99fa5a9bde8541961e61c859083e7334d5974b72dd36983e59dfb2ed591 Apr 17 17:33:30.187567 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:30.187504 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-w4vx9" event={"ID":"545651ab-36d6-443f-8b27-1c4c851d22ce","Type":"ContainerStarted","Data":"4857f99fa5a9bde8541961e61c859083e7334d5974b72dd36983e59dfb2ed591"} Apr 17 17:33:32.194943 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:32.194903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-w4vx9" event={"ID":"545651ab-36d6-443f-8b27-1c4c851d22ce","Type":"ContainerStarted","Data":"2a0d848842bad6393b1eb0e637a5ca3a2f9ad431b17cd1a17ad32b80c9c9fea7"} Apr 17 17:33:32.195394 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:32.195035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:33:32.212631 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:32.212581 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-w4vx9" podStartSLOduration=0.760643376 podStartE2EDuration="3.212567665s" podCreationTimestamp="2026-04-17 17:33:29 +0000 UTC" 
firstStartedPulling="2026-04-17 17:33:29.649680463 +0000 UTC m=+509.385894275" lastFinishedPulling="2026-04-17 17:33:32.101604743 +0000 UTC m=+511.837818564" observedRunningTime="2026-04-17 17:33:32.211069616 +0000 UTC m=+511.947283452" watchObservedRunningTime="2026-04-17 17:33:32.212567665 +0000 UTC m=+511.948781518" Apr 17 17:33:38.200318 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:33:38.200287 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-w4vx9" Apr 17 17:34:39.573179 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.573133 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-v2968"] Apr 17 17:34:39.576120 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.576103 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.578612 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.578590 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 17:34:39.578718 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.578622 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-sm46h\"" Apr 17 17:34:39.584083 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.584061 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-v2968"] Apr 17 17:34:39.587386 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.587366 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-r4qgk"] Apr 17 17:34:39.590544 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.590527 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.592821 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.592797 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-5mbcj\"" Apr 17 17:34:39.592931 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.592911 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 17:34:39.598854 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.598828 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-r4qgk"] Apr 17 17:34:39.671685 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.671647 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.671839 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.671690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv74z\" (UniqueName: \"kubernetes.io/projected/03ac2e31-d2ba-494b-87d2-c587f8c20f12-kube-api-access-bv74z\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.671839 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.671761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kst95\" (UniqueName: \"kubernetes.io/projected/da2262ce-90d6-4ede-88fb-401c133ac820-kube-api-access-kst95\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " 
pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.671839 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.671823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2262ce-90d6-4ede-88fb-401c133ac820-tls-certs\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.773013 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.772968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kst95\" (UniqueName: \"kubernetes.io/projected/da2262ce-90d6-4ede-88fb-401c133ac820-kube-api-access-kst95\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.773013 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.773018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2262ce-90d6-4ede-88fb-401c133ac820-tls-certs\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.773243 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.773058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.773243 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.773099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv74z\" (UniqueName: \"kubernetes.io/projected/03ac2e31-d2ba-494b-87d2-c587f8c20f12-kube-api-access-bv74z\") pod 
\"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.773243 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:34:39.773220 2574 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:39.773343 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:34:39.773291 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert podName:03ac2e31-d2ba-494b-87d2-c587f8c20f12 nodeName:}" failed. No retries permitted until 2026-04-17 17:34:40.273274412 +0000 UTC m=+580.009488225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert") pod "odh-model-controller-696fc77849-r4qgk" (UID: "03ac2e31-d2ba-494b-87d2-c587f8c20f12") : secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:39.775536 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.775507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2262ce-90d6-4ede-88fb-401c133ac820-tls-certs\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.783922 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.783900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kst95\" (UniqueName: \"kubernetes.io/projected/da2262ce-90d6-4ede-88fb-401c133ac820-kube-api-access-kst95\") pod \"model-serving-api-86f7b4b499-v2968\" (UID: \"da2262ce-90d6-4ede-88fb-401c133ac820\") " pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.784040 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.783978 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bv74z\" (UniqueName: \"kubernetes.io/projected/03ac2e31-d2ba-494b-87d2-c587f8c20f12-kube-api-access-bv74z\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:39.887147 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.887053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:39.999604 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:39.999580 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-v2968"] Apr 17 17:34:40.001880 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:34:40.001853 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2262ce_90d6_4ede_88fb_401c133ac820.slice/crio-d1321b903ee59600239a9bc3e1c20f2e00a4374146644af6629bd2589c25a6dd WatchSource:0}: Error finding container d1321b903ee59600239a9bc3e1c20f2e00a4374146644af6629bd2589c25a6dd: Status 404 returned error can't find the container with id d1321b903ee59600239a9bc3e1c20f2e00a4374146644af6629bd2589c25a6dd Apr 17 17:34:40.276632 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:40.276603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:40.278946 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:40.278927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/03ac2e31-d2ba-494b-87d2-c587f8c20f12-cert\") pod \"odh-model-controller-696fc77849-r4qgk\" (UID: \"03ac2e31-d2ba-494b-87d2-c587f8c20f12\") " 
pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:40.366655 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:40.366618 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-v2968" event={"ID":"da2262ce-90d6-4ede-88fb-401c133ac820","Type":"ContainerStarted","Data":"d1321b903ee59600239a9bc3e1c20f2e00a4374146644af6629bd2589c25a6dd"} Apr 17 17:34:40.500281 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:40.500240 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:40.650431 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:40.650395 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-r4qgk"] Apr 17 17:34:40.654783 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:34:40.654751 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ac2e31_d2ba_494b_87d2_c587f8c20f12.slice/crio-dfca72ef1d595082ea10e554156f8ffa26bf1f19a71a2da7003f45f711087a1a WatchSource:0}: Error finding container dfca72ef1d595082ea10e554156f8ffa26bf1f19a71a2da7003f45f711087a1a: Status 404 returned error can't find the container with id dfca72ef1d595082ea10e554156f8ffa26bf1f19a71a2da7003f45f711087a1a Apr 17 17:34:41.371150 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:41.371115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-r4qgk" event={"ID":"03ac2e31-d2ba-494b-87d2-c587f8c20f12","Type":"ContainerStarted","Data":"dfca72ef1d595082ea10e554156f8ffa26bf1f19a71a2da7003f45f711087a1a"} Apr 17 17:34:44.380108 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.380068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-r4qgk" 
event={"ID":"03ac2e31-d2ba-494b-87d2-c587f8c20f12","Type":"ContainerStarted","Data":"a2b7fd22d14217b6c3834fec78d75db68a6eeb0c69be99c938197acc432213ef"} Apr 17 17:34:44.380572 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.380163 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:44.381441 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.381418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-v2968" event={"ID":"da2262ce-90d6-4ede-88fb-401c133ac820","Type":"ContainerStarted","Data":"0134428717f72042e1c3dcb5e63363944a56e6fe15d4440f7bf866f34f6f9575"} Apr 17 17:34:44.381589 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.381551 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:34:44.397430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.397386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-r4qgk" podStartSLOduration=2.480586269 podStartE2EDuration="5.397353406s" podCreationTimestamp="2026-04-17 17:34:39 +0000 UTC" firstStartedPulling="2026-04-17 17:34:40.656732455 +0000 UTC m=+580.392946281" lastFinishedPulling="2026-04-17 17:34:43.573499601 +0000 UTC m=+583.309713418" observedRunningTime="2026-04-17 17:34:44.396278621 +0000 UTC m=+584.132492481" watchObservedRunningTime="2026-04-17 17:34:44.397353406 +0000 UTC m=+584.133567240" Apr 17 17:34:44.413132 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:44.413083 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-v2968" podStartSLOduration=1.886602006 podStartE2EDuration="5.413069665s" podCreationTimestamp="2026-04-17 17:34:39 +0000 UTC" firstStartedPulling="2026-04-17 17:34:40.003593641 +0000 UTC m=+579.739807454" lastFinishedPulling="2026-04-17 
17:34:43.530061301 +0000 UTC m=+583.266275113" observedRunningTime="2026-04-17 17:34:44.412070537 +0000 UTC m=+584.148284369" watchObservedRunningTime="2026-04-17 17:34:44.413069665 +0000 UTC m=+584.149283499" Apr 17 17:34:55.386657 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:55.386628 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-r4qgk" Apr 17 17:34:55.388591 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:34:55.388561 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-v2968" Apr 17 17:38:31.109621 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.109547 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"] Apr 17 17:38:31.112680 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.112660 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.115129 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.115106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-c3f82-serving-cert\"" Apr 17 17:38:31.115294 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.115123 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:38:31.115294 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.115135 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-c3f82-kube-rbac-proxy-sar-config\"" Apr 17 17:38:31.115294 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.115158 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gt7v2\"" Apr 17 17:38:31.119780 ip-10-0-135-127 kubenswrapper[2574]: 
I0417 17:38:31.119763 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"] Apr 17 17:38:31.179107 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.179077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.179262 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.179131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.279665 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.279625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.279841 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.279707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.280276 
ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.280256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.282023 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.282003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls\") pod \"model-chainer-raw-c3f82-56f64cfd7-dh2jn\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.422274 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.422203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:31.532629 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.532600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"] Apr 17 17:38:31.535336 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:38:31.535311 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50098f_fd82_495a_9f35_b4beb31cf37a.slice/crio-b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f WatchSource:0}: Error finding container b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f: Status 404 returned error can't find the container with id b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f Apr 17 17:38:31.537233 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.537217 2574 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 17 17:38:31.948820 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:31.948789 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" event={"ID":"0f50098f-fd82-495a-9f35-b4beb31cf37a","Type":"ContainerStarted","Data":"b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f"} Apr 17 17:38:33.955079 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:33.955045 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" event={"ID":"0f50098f-fd82-495a-9f35-b4beb31cf37a","Type":"ContainerStarted","Data":"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c"} Apr 17 17:38:33.955446 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:33.955198 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:33.974201 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:33.974135 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podStartSLOduration=0.72535739 podStartE2EDuration="2.974120229s" podCreationTimestamp="2026-04-17 17:38:31 +0000 UTC" firstStartedPulling="2026-04-17 17:38:31.537351519 +0000 UTC m=+811.273565331" lastFinishedPulling="2026-04-17 17:38:33.786114346 +0000 UTC m=+813.522328170" observedRunningTime="2026-04-17 17:38:33.9724406 +0000 UTC m=+813.708654434" watchObservedRunningTime="2026-04-17 17:38:33.974120229 +0000 UTC m=+813.710334096" Apr 17 17:38:39.963292 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:39.963259 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:41.190911 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.190882 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"] Apr 17 17:38:41.191350 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.191067 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" containerID="cri-o://84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c" gracePeriod=30 Apr 17 17:38:41.544814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.544782 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"] Apr 17 17:38:41.547958 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.547938 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.550446 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.550409 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\"" Apr 17 17:38:41.550446 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.550409 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:38:41.550608 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.550482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-raw-hpa-4dd55-predictor-serving-cert\"" Apr 17 17:38:41.557572 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.557550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"] Apr 17 17:38:41.654373 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.654340 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.654533 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.654381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.654533 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.654449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.654533 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.654466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr28s\" (UniqueName: \"kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 
17:38:41.755126 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.755305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.755305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr28s\" (UniqueName: \"kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.755305 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.755655 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.755800 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.755780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.757517 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.757500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.764076 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.764055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr28s\" (UniqueName: \"kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s\") pod \"isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.858365 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.858290 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:38:41.974952 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:41.974920 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"] Apr 17 17:38:41.978570 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:38:41.978542 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc03a7bf3_0e30_4a2d_8673_4bc4e1b27217.slice/crio-4df072614d4965a91d379ddd0614bbe1e701fb0a17d74b6378b3123fb25bb421 WatchSource:0}: Error finding container 4df072614d4965a91d379ddd0614bbe1e701fb0a17d74b6378b3123fb25bb421: Status 404 returned error can't find the container with id 4df072614d4965a91d379ddd0614bbe1e701fb0a17d74b6378b3123fb25bb421 Apr 17 17:38:42.982457 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:42.982414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerStarted","Data":"4df072614d4965a91d379ddd0614bbe1e701fb0a17d74b6378b3123fb25bb421"} Apr 17 17:38:44.961808 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:44.961765 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:38:44.989621 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:44.989581 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerStarted","Data":"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"} Apr 17 17:38:49.001326 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:49.001244 2574 generic.go:358] "Generic (PLEG): container finished" podID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerID="8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46" exitCode=0 Apr 17 17:38:49.001672 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:49.001316 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerDied","Data":"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"} Apr 17 17:38:49.963037 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:49.962992 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:38:54.962786 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:54.962746 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:38:54.963244 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:54.962861 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:38:59.963526 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:38:59.963481 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:39:04.962562 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:04.962518 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:39:05.055801 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:05.055769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerStarted","Data":"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"} Apr 17 17:39:08.066601 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:08.066570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerStarted","Data":"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"} Apr 17 17:39:08.066975 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:08.066688 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:39:08.087082 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:08.087043 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podStartSLOduration=1.73480319 podStartE2EDuration="27.087030353s" podCreationTimestamp="2026-04-17 17:38:41 +0000 UTC" firstStartedPulling="2026-04-17 17:38:41.980290356 +0000 UTC 
m=+821.716504172" lastFinishedPulling="2026-04-17 17:39:07.332517523 +0000 UTC m=+847.068731335" observedRunningTime="2026-04-17 17:39:08.085538272 +0000 UTC m=+847.821752117" watchObservedRunningTime="2026-04-17 17:39:08.087030353 +0000 UTC m=+847.823244186" Apr 17 17:39:09.068890 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:09.068859 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" Apr 17 17:39:09.069843 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:09.069818 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 17 17:39:09.962473 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:09.962440 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:39:10.071061 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:10.071027 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 17 17:39:11.219067 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:39:11.217258 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50098f_fd82_495a_9f35_b4beb31cf37a.slice/crio-b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50098f_fd82_495a_9f35_b4beb31cf37a.slice/crio-84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50098f_fd82_495a_9f35_b4beb31cf37a.slice/crio-conmon-84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:11.220095 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:39:11.220074 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f50098f_fd82_495a_9f35_b4beb31cf37a.slice/crio-conmon-84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:39:11.322913 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.322891 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" Apr 17 17:39:11.495041 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.494958 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls\") pod \"0f50098f-fd82-495a-9f35-b4beb31cf37a\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " Apr 17 17:39:11.495231 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.495054 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle\") pod \"0f50098f-fd82-495a-9f35-b4beb31cf37a\" (UID: \"0f50098f-fd82-495a-9f35-b4beb31cf37a\") " Apr 17 17:39:11.495424 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.495397 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "0f50098f-fd82-495a-9f35-b4beb31cf37a" (UID: "0f50098f-fd82-495a-9f35-b4beb31cf37a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:39:11.497003 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.496980 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0f50098f-fd82-495a-9f35-b4beb31cf37a" (UID: "0f50098f-fd82-495a-9f35-b4beb31cf37a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:39:11.596012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.595974 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f50098f-fd82-495a-9f35-b4beb31cf37a-openshift-service-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 17:39:11.596012 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:11.596007 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0f50098f-fd82-495a-9f35-b4beb31cf37a-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\"" Apr 17 17:39:12.077018 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.076981 2574 generic.go:358] "Generic (PLEG): container finished" podID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerID="84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c" exitCode=0 Apr 17 17:39:12.077285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.077037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" event={"ID":"0f50098f-fd82-495a-9f35-b4beb31cf37a","Type":"ContainerDied","Data":"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c"} Apr 17 17:39:12.077285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.077064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn" event={"ID":"0f50098f-fd82-495a-9f35-b4beb31cf37a","Type":"ContainerDied","Data":"b1230a124b0181c8b5545cbf3d6ef52ba234738a76405ec2ed7b9cc386b9593f"} Apr 17 17:39:12.077285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.077080 2574 scope.go:117] "RemoveContainer" containerID="84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c" Apr 17 17:39:12.077285 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.077039 2574 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"
Apr 17 17:39:12.084728 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.084574 2574 scope.go:117] "RemoveContainer" containerID="84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c"
Apr 17 17:39:12.084923 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:39:12.084901 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c\": container with ID starting with 84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c not found: ID does not exist" containerID="84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c"
Apr 17 17:39:12.085003 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.084930 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c"} err="failed to get container status \"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c\": rpc error: code = NotFound desc = could not find container \"84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c\": container with ID starting with 84ba6baa7c5a39a49d0f6f0cbda547946dffb495d9806ac2d08c6003e270482c not found: ID does not exist"
Apr 17 17:39:12.097517 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.097495 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"]
Apr 17 17:39:12.099352 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.099332 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-c3f82-56f64cfd7-dh2jn"]
Apr 17 17:39:12.856704 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:12.856677 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" path="/var/lib/kubelet/pods/0f50098f-fd82-495a-9f35-b4beb31cf37a/volumes"
Apr 17 17:39:15.075538 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:15.075510 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"
Apr 17 17:39:15.076081 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:15.076057 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 17:39:25.076859 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:25.076820 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 17:39:35.076662 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:35.076624 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 17:39:45.076387 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:45.076348 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 17:39:55.076924 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:39:55.076845 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 17 17:40:05.077117 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:05.077085 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"
Apr 17 17:40:11.414893 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.414858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:11.415254 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.415117 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82"
Apr 17 17:40:11.415254 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.415127 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82"
Apr 17 17:40:11.415254 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.415188 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f50098f-fd82-495a-9f35-b4beb31cf37a" containerName="model-chainer-raw-c3f82"
Apr 17 17:40:11.418240 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.418224 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:11.420655 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.420633 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4dd55-kube-rbac-proxy-sar-config\""
Apr 17 17:40:11.420762 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.420670 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4dd55-serving-cert\""
Apr 17 17:40:11.426394 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.426368 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:11.518479 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.518450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:11.518636 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.518488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:11.619782 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.619750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:11.619940 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.619799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:11.619940 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:11.619899 2574 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-serving-cert: secret "model-chainer-raw-hpa-4dd55-serving-cert" not found
Apr 17 17:40:11.620046 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:11.619973 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls podName:40c6001c-79e9-48e8-b933-728e76c92b38 nodeName:}" failed. No retries permitted until 2026-04-17 17:40:12.11995728 +0000 UTC m=+911.856171097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls") pod "model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" (UID: "40c6001c-79e9-48e8-b933-728e76c92b38") : secret "model-chainer-raw-hpa-4dd55-serving-cert" not found
Apr 17 17:40:11.620478 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:11.620455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:12.123870 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:12.123840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:12.126180 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:12.126141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") pod \"model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:12.328607 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:12.328571 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:12.440308 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:12.440282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:12.442880 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:40:12.442856 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40c6001c_79e9_48e8_b933_728e76c92b38.slice/crio-2ee02ec25e67f2dc87495f694af5a85352ec39a8b67bd42a694d8e7176997ec7 WatchSource:0}: Error finding container 2ee02ec25e67f2dc87495f694af5a85352ec39a8b67bd42a694d8e7176997ec7: Status 404 returned error can't find the container with id 2ee02ec25e67f2dc87495f694af5a85352ec39a8b67bd42a694d8e7176997ec7
Apr 17 17:40:13.231891 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:13.231855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" event={"ID":"40c6001c-79e9-48e8-b933-728e76c92b38","Type":"ContainerStarted","Data":"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"}
Apr 17 17:40:13.231891 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:13.231894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" event={"ID":"40c6001c-79e9-48e8-b933-728e76c92b38","Type":"ContainerStarted","Data":"2ee02ec25e67f2dc87495f694af5a85352ec39a8b67bd42a694d8e7176997ec7"}
Apr 17 17:40:13.232154 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:13.231922 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:13.249334 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:13.249277 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podStartSLOduration=2.249264667 podStartE2EDuration="2.249264667s" podCreationTimestamp="2026-04-17 17:40:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:40:13.248506071 +0000 UTC m=+912.984719905" watchObservedRunningTime="2026-04-17 17:40:13.249264667 +0000 UTC m=+912.985478500"
Apr 17 17:40:19.240041 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:19.240014 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:21.471448 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:21.471414 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:21.471824 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:21.471651 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" containerID="cri-o://02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe" gracePeriod=30
Apr 17 17:40:21.688489 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:21.688452 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"]
Apr 17 17:40:21.688853 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:21.688830 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container" containerID="cri-o://abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e" gracePeriod=30
Apr 17 17:40:21.688948 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:21.688879 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kube-rbac-proxy" containerID="cri-o://84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49" gracePeriod=30
Apr 17 17:40:22.261303 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:22.261274 2574 generic.go:358] "Generic (PLEG): container finished" podID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerID="84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49" exitCode=2
Apr 17 17:40:22.261456 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:22.261328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerDied","Data":"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"}
Apr 17 17:40:24.238568 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:24.238530 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:25.022062 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.022039 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"
Apr 17 17:40:25.113946 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.113874 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls\") pod \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") "
Apr 17 17:40:25.113946 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.113911 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr28s\" (UniqueName: \"kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s\") pod \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") "
Apr 17 17:40:25.113946 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.113939 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location\") pod \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") "
Apr 17 17:40:25.114200 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.114021 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") pod \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\" (UID: \"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217\") "
Apr 17 17:40:25.114359 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.114329 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" (UID: "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:40:25.114416 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.114342 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config") pod "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" (UID: "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217"). InnerVolumeSpecName "isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:40:25.115881 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.115859 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s" (OuterVolumeSpecName: "kube-api-access-sr28s") pod "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" (UID: "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217"). InnerVolumeSpecName "kube-api-access-sr28s". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:40:25.115930 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.115863 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" (UID: "c03a7bf3-0e30-4a2d-8673-4bc4e1b27217"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:40:25.215191 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.215139 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-isvc-xgboost-graph-raw-hpa-4dd55-kube-rbac-proxy-sar-config\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:25.215191 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.215185 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:25.215191 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.215197 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sr28s\" (UniqueName: \"kubernetes.io/projected/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kube-api-access-sr28s\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:25.215400 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.215207 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217-kserve-provision-location\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:25.270765 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.270731 2574 generic.go:358] "Generic (PLEG): container finished" podID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerID="abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e" exitCode=0
Apr 17 17:40:25.271159 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.270822 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"
Apr 17 17:40:25.271159 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.270814 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerDied","Data":"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"}
Apr 17 17:40:25.271159 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.270935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh" event={"ID":"c03a7bf3-0e30-4a2d-8673-4bc4e1b27217","Type":"ContainerDied","Data":"4df072614d4965a91d379ddd0614bbe1e701fb0a17d74b6378b3123fb25bb421"}
Apr 17 17:40:25.271159 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.270959 2574 scope.go:117] "RemoveContainer" containerID="84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"
Apr 17 17:40:25.278826 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.278806 2574 scope.go:117] "RemoveContainer" containerID="abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"
Apr 17 17:40:25.285158 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.285141 2574 scope.go:117] "RemoveContainer" containerID="8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"
Apr 17 17:40:25.290883 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.290862 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"]
Apr 17 17:40:25.292320 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.292298 2574 scope.go:117] "RemoveContainer" containerID="84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"
Apr 17 17:40:25.292574 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:25.292557 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49\": container with ID starting with 84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49 not found: ID does not exist" containerID="84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"
Apr 17 17:40:25.292625 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.292581 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49"} err="failed to get container status \"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49\": rpc error: code = NotFound desc = could not find container \"84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49\": container with ID starting with 84528527b229432d99db437ef11618e0970414ae642ce785422ff6d741bfab49 not found: ID does not exist"
Apr 17 17:40:25.292625 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.292598 2574 scope.go:117] "RemoveContainer" containerID="abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"
Apr 17 17:40:25.292812 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:25.292799 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e\": container with ID starting with abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e not found: ID does not exist" containerID="abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"
Apr 17 17:40:25.292846 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.292815 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e"} err="failed to get container status \"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e\": rpc error: code = NotFound desc = could not find container \"abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e\": container with ID starting with abba50cd7cea2636749d22d7c25cb800a0c51e8bf268c5410ad0a66882ebb71e not found: ID does not exist"
Apr 17 17:40:25.292846 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.292827 2574 scope.go:117] "RemoveContainer" containerID="8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"
Apr 17 17:40:25.293019 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:25.293002 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46\": container with ID starting with 8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46 not found: ID does not exist" containerID="8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"
Apr 17 17:40:25.293055 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.293024 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46"} err="failed to get container status \"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46\": rpc error: code = NotFound desc = could not find container \"8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46\": container with ID starting with 8327754b84aaad0ab7991531d48da6fe7d1a839bf53d1d572df5aecfdbdbad46 not found: ID does not exist"
Apr 17 17:40:25.296444 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:25.296425 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-4dd55-predictor-7866c6c885-q2blh"]
Apr 17 17:40:26.859152 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:26.859120 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" path="/var/lib/kubelet/pods/c03a7bf3-0e30-4a2d-8673-4bc4e1b27217/volumes"
Apr 17 17:40:29.237956 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:29.237918 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:34.238146 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:34.238109 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:34.238614 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:34.238245 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:39.239209 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:39.239148 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:44.238767 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:44.238724 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:49.238830 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:49.238789 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:52.105821 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.105796 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:52.303336 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.303303 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle\") pod \"40c6001c-79e9-48e8-b933-728e76c92b38\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") "
Apr 17 17:40:52.303501 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.303348 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") pod \"40c6001c-79e9-48e8-b933-728e76c92b38\" (UID: \"40c6001c-79e9-48e8-b933-728e76c92b38\") "
Apr 17 17:40:52.303662 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.303639 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "40c6001c-79e9-48e8-b933-728e76c92b38" (UID: "40c6001c-79e9-48e8-b933-728e76c92b38"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:40:52.305382 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.305357 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "40c6001c-79e9-48e8-b933-728e76c92b38" (UID: "40c6001c-79e9-48e8-b933-728e76c92b38"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:40:52.345419 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.345385 2574 generic.go:358] "Generic (PLEG): container finished" podID="40c6001c-79e9-48e8-b933-728e76c92b38" containerID="02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe" exitCode=0
Apr 17 17:40:52.345569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.345441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" event={"ID":"40c6001c-79e9-48e8-b933-728e76c92b38","Type":"ContainerDied","Data":"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"}
Apr 17 17:40:52.345569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.345469 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2" event={"ID":"40c6001c-79e9-48e8-b933-728e76c92b38","Type":"ContainerDied","Data":"2ee02ec25e67f2dc87495f694af5a85352ec39a8b67bd42a694d8e7176997ec7"}
Apr 17 17:40:52.345569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.345484 2574 scope.go:117] "RemoveContainer" containerID="02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"
Apr 17 17:40:52.345569 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.345446 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"
Apr 17 17:40:52.353284 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.353266 2574 scope.go:117] "RemoveContainer" containerID="02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"
Apr 17 17:40:52.353567 ip-10-0-135-127 kubenswrapper[2574]: E0417 17:40:52.353546 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe\": container with ID starting with 02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe not found: ID does not exist" containerID="02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"
Apr 17 17:40:52.353643 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.353578 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe"} err="failed to get container status \"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe\": rpc error: code = NotFound desc = could not find container \"02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe\": container with ID starting with 02a30d284b866f064b9fbca7308f315e935609bda4a46b6300666ed005e0efbe not found: ID does not exist"
Apr 17 17:40:52.366534 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.366509 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:52.369558 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.369536 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4dd55-6dfcf75dc5-bc9f2"]
Apr 17 17:40:52.404257 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.404229 2574 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40c6001c-79e9-48e8-b933-728e76c92b38-openshift-service-ca-bundle\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:52.404257 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.404253 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40c6001c-79e9-48e8-b933-728e76c92b38-proxy-tls\") on node \"ip-10-0-135-127.ec2.internal\" DevicePath \"\""
Apr 17 17:40:52.857162 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:40:52.857126 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" path="/var/lib/kubelet/pods/40c6001c-79e9-48e8-b933-728e76c92b38/volumes"
Apr 17 17:49:29.625910 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.625872 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ttz7/must-gather-bvhwv"]
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626255 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626273 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626296 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="storage-initializer"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626305 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="storage-initializer"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626321 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kube-rbac-proxy"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626329 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kube-rbac-proxy"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626340 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626348 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626411 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="40c6001c-79e9-48e8-b933-728e76c92b38" containerName="model-chainer-raw-hpa-4dd55"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626425 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kube-rbac-proxy"
Apr 17 17:49:29.626458 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.626434 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c03a7bf3-0e30-4a2d-8673-4bc4e1b27217" containerName="kserve-container"
Apr 17 17:49:29.629289 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.629270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ttz7/must-gather-bvhwv"
Apr 17 17:49:29.632160 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.632134 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"kube-root-ca.crt\""
Apr 17 17:49:29.632160 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.632146 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7ttz7\"/\"openshift-service-ca.crt\""
Apr 17 17:49:29.633114 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.633098 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7ttz7\"/\"default-dockercfg-ml4hx\""
Apr 17 17:49:29.638795 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.638774 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/must-gather-bvhwv"]
Apr 17 17:49:29.738082 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.738055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62a550da-3ce4-4bf7-9734-e38fc659b4c5-must-gather-output\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv"
Apr 17 17:49:29.738244 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.738093 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5j7f\" (UniqueName: \"kubernetes.io/projected/62a550da-3ce4-4bf7-9734-e38fc659b4c5-kube-api-access-g5j7f\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv"
Apr 17 17:49:29.838900 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.838871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5j7f\" (UniqueName:
\"kubernetes.io/projected/62a550da-3ce4-4bf7-9734-e38fc659b4c5-kube-api-access-g5j7f\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv" Apr 17 17:49:29.839042 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.838933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62a550da-3ce4-4bf7-9734-e38fc659b4c5-must-gather-output\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv" Apr 17 17:49:29.839231 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.839216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/62a550da-3ce4-4bf7-9734-e38fc659b4c5-must-gather-output\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv" Apr 17 17:49:29.847615 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.847594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5j7f\" (UniqueName: \"kubernetes.io/projected/62a550da-3ce4-4bf7-9734-e38fc659b4c5-kube-api-access-g5j7f\") pod \"must-gather-bvhwv\" (UID: \"62a550da-3ce4-4bf7-9734-e38fc659b4c5\") " pod="openshift-must-gather-7ttz7/must-gather-bvhwv" Apr 17 17:49:29.938116 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:29.938049 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ttz7/must-gather-bvhwv" Apr 17 17:49:30.049353 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:30.049329 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/must-gather-bvhwv"] Apr 17 17:49:30.051211 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:49:30.051179 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a550da_3ce4_4bf7_9734_e38fc659b4c5.slice/crio-00d18de15a9af2c537ec3c71708172767a8e98afa39a26d910261d8b8d6895a5 WatchSource:0}: Error finding container 00d18de15a9af2c537ec3c71708172767a8e98afa39a26d910261d8b8d6895a5: Status 404 returned error can't find the container with id 00d18de15a9af2c537ec3c71708172767a8e98afa39a26d910261d8b8d6895a5 Apr 17 17:49:30.052843 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:30.052823 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:49:30.670638 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:30.670597 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/must-gather-bvhwv" event={"ID":"62a550da-3ce4-4bf7-9734-e38fc659b4c5","Type":"ContainerStarted","Data":"00d18de15a9af2c537ec3c71708172767a8e98afa39a26d910261d8b8d6895a5"} Apr 17 17:49:31.676455 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:31.676408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/must-gather-bvhwv" event={"ID":"62a550da-3ce4-4bf7-9734-e38fc659b4c5","Type":"ContainerStarted","Data":"a0f364d43221059178bc05b4c651cb70d867aaff1700ae5ea9e3e1abeabd882d"} Apr 17 17:49:31.676455 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:31.676459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/must-gather-bvhwv" 
event={"ID":"62a550da-3ce4-4bf7-9734-e38fc659b4c5","Type":"ContainerStarted","Data":"fff6872c8be2e7362e4ad83fb9dc6a1ccf18ca601a64aab3dd3cf9c959499e1f"} Apr 17 17:49:31.695679 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:31.695616 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7ttz7/must-gather-bvhwv" podStartSLOduration=1.8275555319999999 podStartE2EDuration="2.695598735s" podCreationTimestamp="2026-04-17 17:49:29 +0000 UTC" firstStartedPulling="2026-04-17 17:49:30.052946448 +0000 UTC m=+1469.789160260" lastFinishedPulling="2026-04-17 17:49:30.920989646 +0000 UTC m=+1470.657203463" observedRunningTime="2026-04-17 17:49:31.693499315 +0000 UTC m=+1471.429713150" watchObservedRunningTime="2026-04-17 17:49:31.695598735 +0000 UTC m=+1471.431812570" Apr 17 17:49:32.296520 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:32.296485 2574 ???:1] "http: TLS handshake error from 10.0.135.127:55836: EOF" Apr 17 17:49:32.299430 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:32.299405 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ftpnl_342d0e0b-38a9-4fb2-a76e-aa5459a12a9e/global-pull-secret-syncer/0.log" Apr 17 17:49:32.448794 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:32.448764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nrxzd_2a664808-7e4f-495a-bd8e-3278a11bb604/konnectivity-agent/0.log" Apr 17 17:49:32.562272 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:32.562162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-127.ec2.internal_356446819b043d77b4ba2d5504f23404/haproxy/0.log" Apr 17 17:49:36.024754 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:36.024717 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k9j9_9de9badb-14ff-4855-9c22-842335059617/node-exporter/0.log" Apr 17 17:49:36.045015 
ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:36.044977 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k9j9_9de9badb-14ff-4855-9c22-842335059617/kube-rbac-proxy/0.log" Apr 17 17:49:36.068349 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:36.068313 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9k9j9_9de9badb-14ff-4855-9c22-842335059617/init-textfile/0.log" Apr 17 17:49:39.198814 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.198778 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx"] Apr 17 17:49:39.203124 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.203104 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.210083 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.210060 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx"] Apr 17 17:49:39.315897 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.315863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhn6\" (UniqueName: \"kubernetes.io/projected/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-kube-api-access-jzhn6\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.316059 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.315904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-proc\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " 
pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.316059 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.315936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-sys\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.316059 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.315984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-podres\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.316198 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.316059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-lib-modules\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417386 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-podres\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417555 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-lib-modules\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417555 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-podres\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417555 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhn6\" (UniqueName: \"kubernetes.io/projected/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-kube-api-access-jzhn6\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417555 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-proc\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417745 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-sys\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417745 ip-10-0-135-127 kubenswrapper[2574]: I0417 
17:49:39.417621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-lib-modules\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417745 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417634 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-proc\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.417745 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.417681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-sys\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.425164 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.425137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhn6\" (UniqueName: \"kubernetes.io/projected/41d0b23b-c5d3-4bc9-9cee-86686bf687ec-kube-api-access-jzhn6\") pod \"perf-node-gather-daemonset-n5bpx\" (UID: \"41d0b23b-c5d3-4bc9-9cee-86686bf687ec\") " pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.515581 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.515542 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:39.653449 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.653416 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx"] Apr 17 17:49:39.656982 ip-10-0-135-127 kubenswrapper[2574]: W0417 17:49:39.656955 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41d0b23b_c5d3_4bc9_9cee_86686bf687ec.slice/crio-cac2758ad84e7f498bdfbe9df4c17b4e6679e433ec08908ff8fd1da52489b6b4 WatchSource:0}: Error finding container cac2758ad84e7f498bdfbe9df4c17b4e6679e433ec08908ff8fd1da52489b6b4: Status 404 returned error can't find the container with id cac2758ad84e7f498bdfbe9df4c17b4e6679e433ec08908ff8fd1da52489b6b4 Apr 17 17:49:39.707187 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.707146 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" event={"ID":"41d0b23b-c5d3-4bc9-9cee-86686bf687ec","Type":"ContainerStarted","Data":"cac2758ad84e7f498bdfbe9df4c17b4e6679e433ec08908ff8fd1da52489b6b4"} Apr 17 17:49:39.893188 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.893109 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g8tgx_cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01/dns/0.log" Apr 17 17:49:39.912138 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.912116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-g8tgx_cceba8d9-fe83-48a9-9faf-5ce2fdf1dc01/kube-rbac-proxy/0.log" Apr 17 17:49:39.977160 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:39.977134 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4dfq5_d31b25a9-8351-4624-8ef6-a1389bdd2474/dns-node-resolver/0.log" Apr 17 17:49:40.444500 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:40.444472 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-image-registry_node-ca-d75sg_02498340-44b9-4152-9802-82fbeecce918/node-ca/0.log" Apr 17 17:49:40.711297 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:40.711210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" event={"ID":"41d0b23b-c5d3-4bc9-9cee-86686bf687ec","Type":"ContainerStarted","Data":"a285f85cf1ecbcab58c0147e19aa915cf131f6d884d1308d256e0f8cfbc24ee6"} Apr 17 17:49:40.711466 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:40.711309 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:40.728355 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:40.728300 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" podStartSLOduration=1.728281073 podStartE2EDuration="1.728281073s" podCreationTimestamp="2026-04-17 17:49:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:49:40.727527546 +0000 UTC m=+1480.463741381" watchObservedRunningTime="2026-04-17 17:49:40.728281073 +0000 UTC m=+1480.464494909" Apr 17 17:49:41.475692 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:41.475664 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9mzwz_a5a0550e-4a4c-4a4b-841e-64468d8467ce/serve-healthcheck-canary/0.log" Apr 17 17:49:41.987021 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:41.986992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr9jd_737470dd-5a5c-4575-9059-1060d3ebbea6/kube-rbac-proxy/0.log" Apr 17 17:49:42.006605 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:42.006577 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr9jd_737470dd-5a5c-4575-9059-1060d3ebbea6/exporter/0.log" Apr 17 17:49:42.025884 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:42.025858 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cr9jd_737470dd-5a5c-4575-9059-1060d3ebbea6/extractor/0.log" Apr 17 17:49:44.030777 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:44.030752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-v2968_da2262ce-90d6-4ede-88fb-401c133ac820/server/0.log" Apr 17 17:49:44.122292 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:44.122261 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-r4qgk_03ac2e31-d2ba-494b-87d2-c587f8c20f12/manager/0.log" Apr 17 17:49:44.165536 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:44.165511 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-w4vx9_545651ab-36d6-443f-8b27-1c4c851d22ce/seaweedfs/0.log" Apr 17 17:49:46.725123 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:46.725094 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7ttz7/perf-node-gather-daemonset-n5bpx" Apr 17 17:49:49.157110 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.157043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5b72d_807ce854-eb81-42f4-8fb8-0060d033ffbf/kube-multus/0.log" Apr 17 17:49:49.329151 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.329123 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/kube-multus-additional-cni-plugins/0.log" Apr 17 17:49:49.349579 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.349554 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/egress-router-binary-copy/0.log" Apr 17 17:49:49.369261 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.369239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/cni-plugins/0.log" Apr 17 17:49:49.389505 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.389481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/bond-cni-plugin/0.log" Apr 17 17:49:49.408637 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.408575 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/routeoverride-cni/0.log" Apr 17 17:49:49.428349 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.428331 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/whereabouts-cni-bincopy/0.log" Apr 17 17:49:49.450794 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.450772 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cf28x_474e9a38-21a3-415a-a945-80417640d569/whereabouts-cni/0.log" Apr 17 17:49:49.715658 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.715620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cqtr2_f3033f4c-b4a1-45de-8f08-0fbf65425c86/network-metrics-daemon/0.log" Apr 17 17:49:49.734920 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:49.734897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cqtr2_f3033f4c-b4a1-45de-8f08-0fbf65425c86/kube-rbac-proxy/0.log" Apr 17 17:49:51.202256 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:49:51.202229 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/ovn-controller/0.log" Apr 17 17:49:51.233582 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.233560 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/ovn-acl-logging/0.log" Apr 17 17:49:51.254453 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.254431 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/kube-rbac-proxy-node/0.log" Apr 17 17:49:51.272916 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.272893 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:49:51.289938 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.289915 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/northd/0.log" Apr 17 17:49:51.310389 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.310368 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/nbdb/0.log" Apr 17 17:49:51.330109 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.330089 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/sbdb/0.log" Apr 17 17:49:51.424010 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:51.423986 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sgzx2_110a4c18-b7af-4bb1-8f5e-f332eb485ccb/ovnkube-controller/0.log" Apr 17 17:49:52.394923 ip-10-0-135-127 
kubenswrapper[2574]: I0417 17:49:52.394888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vvm4h_4b83a9e4-5073-4105-bc72-4980376e169f/network-check-target-container/0.log" Apr 17 17:49:53.344802 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:53.344776 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-wt5ct_01868bf7-f6d2-461d-8bf1-006126117f62/iptables-alerter/0.log" Apr 17 17:49:53.912678 ip-10-0-135-127 kubenswrapper[2574]: I0417 17:49:53.912649 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-5wf77_bdfc917a-4e35-4bac-8c08-84c70e29539e/tuned/0.log"