Apr 22 18:35:57.815210 ip-10-0-134-126 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:35:58.288228 ip-10-0-134-126 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:35:58.288228 ip-10-0-134-126 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:35:58.288228 ip-10-0-134-126 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:35:58.288228 ip-10-0-134-126 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:35:58.288228 ip-10-0-134-126 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:35:58.290244 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.290146    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:35:58.293372 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293355    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293374    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293378    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293381    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293384    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293387    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293391    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293394    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293396    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293399    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293402    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293404    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293407    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293411    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293416    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:35:58.293413 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293419    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293422    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293425    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293428    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293430    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293433    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293435    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293438    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293441    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293443    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293446    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293449    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293453    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293455    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293458    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293462    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293464    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293467    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293470    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:35:58.293768 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293472    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293475    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293477    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293480    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293482    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293485    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293487    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293490    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293493    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293496    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293498    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293500    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293503    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293505    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293508    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293518    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293521    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293523    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293526    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:35:58.294240 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293529    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293531    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293534    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293537    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293539    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293542    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293545    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293548    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293551    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293553    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293557    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293559    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293562    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293564    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293567    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293569    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293571    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293574    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293576    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293579    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:35:58.294705 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293581    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293585    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293587    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293590    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293592    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293595    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293598    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293600    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293603    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293606    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293609    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293611    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.293614    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294050    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294056    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294060    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294064    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294067    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294070    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294073    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:35:58.295215 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294075    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294078    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294080    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294083    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294086    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294088    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294091    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294093    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294096    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294098    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294100    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294103    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294106    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294109    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294112    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294114    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294117    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294120    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294122    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294125    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:35:58.295701 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294129    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294132    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294134    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294137    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294139    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294142    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294144    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294147    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294149    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294152    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294154    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294157    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294160    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294162    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294165    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294167    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294170    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294172    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294175    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:35:58.296296 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294177    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294179    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294183    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294185    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294188    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294191    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294193    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294196    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294199    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294201    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294204    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294206    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294209    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294212    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294215    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294217    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294220    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294223    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294225    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:35:58.296760 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294228    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294230    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294233    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294235    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294238    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294240    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294243    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294247    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294251    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294254    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294257    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294260    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294264    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294267    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294270    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294274    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294276    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294279    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294282    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294285    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:35:58.297239 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.294288    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294363    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294370    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294378    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294385    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294392    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294398    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294405    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294411    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294414    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294417    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294421    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294425    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294428    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294431    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294435    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294438    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294441    2571 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294444    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294447    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294451    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294454    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294458    2571 flags.go:64] FLAG: --config-dir=""
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294460    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294464    2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:35:58.297748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294468    2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294471    2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294474    2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294478    2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294481    2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294484    2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294486    2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294490    2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294493    2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294499    2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294502    2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294505    2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294508    2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294512    2571 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294515    2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294518    2571 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294522    2571 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294525    2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294528    2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294531    2571 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294535    2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294538    2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294541    2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294544    2571 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294547    2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:35:58.298367 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294550    2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294553    2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294556    2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294558    2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294562    2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294565    2571 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294569    2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294572    2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:35:58.299024 ip-10-0-134-126
kubenswrapper[2571]: I0422 18:35:58.294575 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294578 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294581 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294584 2571 flags.go:64] FLAG: --help="false" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294587 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294590 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294593 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294596 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294600 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294604 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294607 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294612 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294615 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294618 2571 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294621 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294624 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:35:58.299024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294627 2571 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294630 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294633 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294636 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294639 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294642 2571 flags.go:64] FLAG: --lock-file="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294645 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294648 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294651 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294657 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294660 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294663 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: 
I0422 18:35:58.294665 2571 flags.go:64] FLAG: --logging-format="text" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294668 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294672 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294674 2571 flags.go:64] FLAG: --manifest-url="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294677 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294682 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294685 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294689 2571 flags.go:64] FLAG: --max-pods="110" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294692 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294695 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294698 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294701 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294704 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:35:58.299608 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294707 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294711 2571 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294721 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294724 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294727 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294730 2571 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294733 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294738 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294741 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294744 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294747 2571 flags.go:64] FLAG: --port="10250" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294750 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294753 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b0fb851764523255" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294756 2571 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294759 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294762 
2571 flags.go:64] FLAG: --register-node="true" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294765 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294768 2571 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294772 2571 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294775 2571 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294778 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294780 2571 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294784 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294787 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294790 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:35:58.300223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294793 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294796 2571 flags.go:64] FLAG: --runonce="false" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294799 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294802 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294805 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:35:58.294808 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294811 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294814 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294818 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294823 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294826 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294829 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294832 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294835 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294838 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294841 2571 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294843 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294849 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294852 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294855 2571 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294859 2571 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294862 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294877 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294881 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294884 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:35:58.300855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294886 2571 flags.go:64] FLAG: --v="2" Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294891 2571 flags.go:64] FLAG: --version="false" Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294895 2571 flags.go:64] FLAG: --vmodule="" Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294899 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.294902 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295045 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295050 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295053 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295056 2571 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295059 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295061 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295064 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295067 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295069 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295072 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295075 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295079 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295082 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295085 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295087 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295090 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:35:58.301529 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295093 
2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295095 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295099 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295103 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295107 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295111 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295114 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295117 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295120 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295123 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295125 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295128 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295130 2571 feature_gate.go:328] unrecognized 
feature gate: NetworkDiagnosticsConfig Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295133 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295135 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295138 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295140 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295143 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295149 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:35:58.302070 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295152 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295155 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295157 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295160 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295162 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295165 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 
18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295167 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295170 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295174 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295176 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295180 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295182 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295185 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295188 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295190 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295192 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295195 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295197 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: 
W0422 18:35:58.295200 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:35:58.302562 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295202 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295205 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295208 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295210 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295213 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295215 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295218 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295221 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295223 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295226 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295229 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295231 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 
18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295235 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295238 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295241 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295243 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295246 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295248 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295251 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295253 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:35:58.303071 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295256 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295264 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295267 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295270 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295273 2571 feature_gate.go:328] unrecognized 
feature gate: HighlyAvailableArbiter Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295275 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295278 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295280 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295283 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295285 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295288 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.295290 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:35:58.303567 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.296188 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:35:58.304351 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.304329 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:35:58.304386 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.304352 2571 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:35:58.304424 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304415 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:35:58.304424 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304423 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304427 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304431 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304434 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304437 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304440 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304443 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304446 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304448 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304452 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304454 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304457 
2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304460 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304462 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304467 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304471 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304475 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304477 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:35:58.304477 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304484 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304488 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304490 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304493 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304496 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304498 2571 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304501 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304503 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304506 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304509 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304511 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304514 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304517 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304520 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304522 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304525 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304527 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304530 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304532 2571 
feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304534 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:35:58.304941 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304537 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304539 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304542 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304544 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304546 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304549 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304551 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304554 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304556 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304559 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304563 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304567 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304569 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304572 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304575 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304578 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304581 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304583 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304586 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:35:58.305439 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304588 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304591 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304594 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304597 2571 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304599 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304602 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304605 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304608 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304610 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304613 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304615 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304617 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304620 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304623 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304626 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304628 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:35:58.305917 ip-10-0-134-126 
kubenswrapper[2571]: W0422 18:35:58.304630 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304633 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304636 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304638 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:35:58.305917 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304640 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304643 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304646 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304648 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304651 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304653 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304656 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304659 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.304665 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true 
MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304782 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304788 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304791 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304794 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304797 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304800 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304803 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:35:58.306412 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304805 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304808 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304810 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:35:58.306799 
ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304815 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304818 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304821 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304823 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304826 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304829 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304831 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304834 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304836 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304839 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304842 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304845 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304848 2571 feature_gate.go:328] unrecognized feature 
gate: HighlyAvailableArbiter Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304850 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304853 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304855 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304858 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:35:58.306799 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304860 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304863 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304881 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304885 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304888 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304891 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304894 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304896 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 
18:35:58.304899 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304903 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304906 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304909 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304911 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304914 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304916 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304919 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304922 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304925 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304927 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304930 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:35:58.307314 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304933 2571 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304935 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304938 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304940 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304944 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304946 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304949 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304952 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304955 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304958 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304960 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304963 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304965 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304968 2571 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304970 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304973 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304976 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304978 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304981 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304984 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:35:58.307789 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304987 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304990 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304994 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.304997 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305001 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305005 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305008 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305010 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305013 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305016 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305019 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305021 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305024 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305026 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305029 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305031 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305034 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305037 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:35:58.308354 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:58.305039 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:35:58.308798 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.305044 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:35:58.308798 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.305785 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:35:58.308798 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.307944 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:35:58.309049 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.309036 2571 server.go:1019] "Starting client certificate rotation"
Apr 22 18:35:58.309164 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.309137 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:35:58.309223 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.309196 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:35:58.334782 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.334755 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:35:58.339287 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.339228 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:35:58.358739 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.358714 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:35:58.365593 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.365537 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:35:58.366299 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.366281 2571 log.go:25] "Validated CRI v1 image API"
Apr 22 18:35:58.367615 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.367599 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:35:58.371858 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.371837 2571 fs.go:135] Filesystem UUIDs: map[247e8a86-c67e-43b5-b732-7176da36547b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 86f6082a-cfdd-44a2-87fa-2490dbc74b6d:/dev/nvme0n1p3]
Apr 22 18:35:58.371948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.371858 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:35:58.378391 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.378270 2571 manager.go:217] Machine: {Timestamp:2026-04-22 18:35:58.376157528 +0000 UTC m=+0.442370599 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100726 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2203ecb2428eb424291e0fba24718c SystemUUID:ec2203ec-b242-8eb4-2429-1e0fba24718c BootID:dd02d5d9-b32d-4058-935d-c14fa9a75bb6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:26:9b:9e:97:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:26:9b:9e:97:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:31:ad:83:82:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:35:58.378391 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.378384 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:35:58.378527 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.378515 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:35:58.379705 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.379675 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:35:58.379863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.379709 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-126.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:35:58.379922 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.379886 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:35:58.379922 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.379896 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:35:58.379922 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.379917 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:35:58.381093 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.381081 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:35:58.382471 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.382460 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:35:58.382792 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.382781 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:35:58.385419 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.385406 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:35:58.385468 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.385429 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:35:58.385468 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.385442 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:35:58.385468 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.385452 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:35:58.385468 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.385462 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:35:58.386603 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.386590 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:35:58.386672 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.386610 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:35:58.390560 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.390542 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 18:35:58.392065 ip-10-0-134-126
kubenswrapper[2571]: I0422 18:35:58.392048 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:35:58.394550 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394525 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:35:58.394616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394559 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:35:58.394616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394573 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:35:58.394616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394585 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:35:58.394616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394597 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:35:58.394616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394609 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394620 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394634 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394648 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394660 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394688 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 
18:35:58.394759 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.394707 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:35:58.395772 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.395761 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:35:58.395808 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.395774 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:35:58.399548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.399533 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:35:58.399621 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.399572 2571 server.go:1295] "Started kubelet" Apr 22 18:35:58.400837 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.400697 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-126.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:35:58.400940 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.400729 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:35:58.400940 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.400848 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-126.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:35:58.400940 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.399662 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:35:58.401073 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.400885 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:35:58.401245 
ip-10-0-134-126 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:35:58.401378 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.400770 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:35:58.403576 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.402667 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:35:58.403576 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.403102 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:35:58.409105 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.408065 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-126.ec2.internal.18a8c1a2e6d9a145 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-126.ec2.internal,UID:ip-10-0-134-126.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-126.ec2.internal,},FirstTimestamp:2026-04-22 18:35:58.399545669 +0000 UTC m=+0.465758729,LastTimestamp:2026-04-22 18:35:58.399545669 +0000 UTC m=+0.465758729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-126.ec2.internal,}" Apr 22 18:35:58.409214 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.409129 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:35:58.409214 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:35:58.409132 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:35:58.409858 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.409839 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:35:58.409948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.409863 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:35:58.409997 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.409985 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:35:58.410061 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410049 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:35:58.410113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410062 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:35:58.410222 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.410205 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found" Apr 22 18:35:58.410323 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.410216 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:35:58.410758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410739 2571 factory.go:153] Registering CRI-O factory Apr 22 18:35:58.410839 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410815 2571 factory.go:223] Registration of the crio container factory successfully Apr 22 18:35:58.410949 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410937 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:35:58.410996 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410953 2571 factory.go:55] Registering systemd factory Apr 22 18:35:58.410996 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410962 2571 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:35:58.410996 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410980 2571 factory.go:103] Registering Raw factory Apr 22 18:35:58.410996 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.410990 2571 manager.go:1196] Started watching for new ooms in manager Apr 22 18:35:58.411338 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.411326 2571 manager.go:319] Starting recovery of all containers Apr 22 18:35:58.417075 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.417051 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-126.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:35:58.417156 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.417099 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io 
is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:35:58.420939 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.420920 2571 manager.go:324] Recovery completed Apr 22 18:35:58.425990 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.425978 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:35:58.429860 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.429842 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:35:58.429950 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.429891 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:35:58.429950 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.429905 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:35:58.430384 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.430375 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:35:58.430429 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.430384 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:35:58.430429 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.430399 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:35:58.432915 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.432896 2571 policy_none.go:49] "None policy: Start" Apr 22 18:35:58.432991 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.432927 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:35:58.432991 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.432942 2571 state_mem.go:35] 
"Initializing new in-memory state store" Apr 22 18:35:58.433049 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.432967 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-126.ec2.internal.18a8c1a2e8a82fdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-126.ec2.internal,UID:ip-10-0-134-126.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-126.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-126.ec2.internal,},FirstTimestamp:2026-04-22 18:35:58.429859805 +0000 UTC m=+0.496072864,LastTimestamp:2026-04-22 18:35:58.429859805 +0000 UTC m=+0.496072864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-126.ec2.internal,}" Apr 22 18:35:58.441902 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.441820 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-126.ec2.internal.18a8c1a2e8a8c935 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-126.ec2.internal,UID:ip-10-0-134-126.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-126.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-126.ec2.internal,},FirstTimestamp:2026-04-22 18:35:58.429899061 +0000 UTC m=+0.496112121,LastTimestamp:2026-04-22 18:35:58.429899061 +0000 UTC 
m=+0.496112121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-126.ec2.internal,}" Apr 22 18:35:58.453591 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.453493 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-126.ec2.internal.18a8c1a2e8a8f233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-126.ec2.internal,UID:ip-10-0-134-126.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-126.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-126.ec2.internal,},FirstTimestamp:2026-04-22 18:35:58.429909555 +0000 UTC m=+0.496122616,LastTimestamp:2026-04-22 18:35:58.429909555 +0000 UTC m=+0.496122616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-126.ec2.internal,}" Apr 22 18:35:58.471619 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.471595 2571 manager.go:341] "Starting Device Plugin manager" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.471699 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.471713 2571 server.go:85] "Starting device plugin registration server" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.472035 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.472047 2571 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.472139 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.472227 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.472237 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.472780 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:35:58.479923 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.472820 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-126.ec2.internal\" not found" Apr 22 18:35:58.485883 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.485776 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-126.ec2.internal.18a8c1a2eb4af481 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-126.ec2.internal,UID:ip-10-0-134-126.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-134-126.ec2.internal,},FirstTimestamp:2026-04-22 18:35:58.474081409 +0000 UTC m=+0.540294456,LastTimestamp:2026-04-22 18:35:58.474081409 +0000 UTC m=+0.540294456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-126.ec2.internal,}" Apr 22 18:35:58.506943 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.506904 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:35:58.508139 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.508123 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:35:58.508241 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.508150 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:35:58.508241 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.508173 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:35:58.508241 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.508182 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:35:58.508373 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.508244 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:35:58.518307 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.518281 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 22 18:35:58.522150 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.522132 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-plctj" Apr 22 18:35:58.533222 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.533198 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-plctj" Apr 
22 18:35:58.573440 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.573361 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:35:58.574749 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.574730 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:35:58.574855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.574759 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:35:58.574855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.574770 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:35:58.574855 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.574796 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.594008 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.593980 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.594008 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.594008 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-126.ec2.internal\": node \"ip-10-0-134-126.ec2.internal\" not found" Apr 22 18:35:58.605328 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.605302 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found" Apr 22 18:35:58.608315 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.608293 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"] Apr 22 18:35:58.608374 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:35:58.608365 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:35:58.609981 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.609962 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:35:58.610103 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.609991 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:35:58.610103 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.610001 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:35:58.611126 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.611107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.611196 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.611140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.611336 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.611320 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:35:58.611488 ip-10-0-134-126 kubenswrapper[2571]: 
I0422 18:35:58.611473 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" Apr 22 18:35:58.611545 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.611502 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:35:58.612107 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612090 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:35:58.612202 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612109 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:35:58.612202 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612120 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:35:58.612202 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612132 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:35:58.612202 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612135 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:35:58.612202 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.612147 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:35:58.613196 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.613184 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.613257 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.613205 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:35:58.613918 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.613905 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:35:58.613984 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.613931 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:35:58.613984 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.613943 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:35:58.640677 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.640653 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-126.ec2.internal\" not found" node="ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.645173 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.645151 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-126.ec2.internal\" not found" node="ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.706182 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.706150 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:58.711341 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.711315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.711458 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.711346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.711458 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.711369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21199ccb1140108a6b341eb7b765d477-config\") pod \"kube-apiserver-proxy-ip-10-0-134-126.ec2.internal\" (UID: \"21199ccb1140108a6b341eb7b765d477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.711458 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.711410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.711458 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.711426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b07c8e88c54ca5836b8287342f637e6f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal\" (UID: \"b07c8e88c54ca5836b8287342f637e6f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.807164 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.807112 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:58.812534 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.812497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21199ccb1140108a6b341eb7b765d477-config\") pod \"kube-apiserver-proxy-ip-10-0-134-126.ec2.internal\" (UID: \"21199ccb1140108a6b341eb7b765d477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.812598 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.812523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/21199ccb1140108a6b341eb7b765d477-config\") pod \"kube-apiserver-proxy-ip-10-0-134-126.ec2.internal\" (UID: \"21199ccb1140108a6b341eb7b765d477\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.908015 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:58.907940 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:58.942420 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.942394 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:58.947955 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:58.947933 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:59.009071 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.009030 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.109535 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.109496 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.210034 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.209946 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.308358 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.308332 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:35:59.310502 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.310482 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.409789 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.409756 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:35:59.410980 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.410960 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.420088 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.420064 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:35:59.447292 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.447260 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9vtlc"
Apr 22 18:35:59.455483 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.455450 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9vtlc"
Apr 22 18:35:59.511076 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.511052 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.535815 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.535745 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:30:58 +0000 UTC" deadline="2027-11-19 01:54:02.87543954 +0000 UTC"
Apr 22 18:35:59.535815 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.535789 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13807h18m3.339655771s"
Apr 22 18:35:59.574288 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:59.574242 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb07c8e88c54ca5836b8287342f637e6f.slice/crio-2477d3009843e0b0b9a7a5d1c82f2b7c1bc19990e5b2326fe477841de033623b WatchSource:0}: Error finding container 2477d3009843e0b0b9a7a5d1c82f2b7c1bc19990e5b2326fe477841de033623b: Status 404 returned error can't find the container with id 2477d3009843e0b0b9a7a5d1c82f2b7c1bc19990e5b2326fe477841de033623b
Apr 22 18:35:59.574482 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:35:59.574454 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21199ccb1140108a6b341eb7b765d477.slice/crio-0539d91c981a8c3629d3075cfb898b6ac2e13603f2cd8798255850e53cb7fdcd WatchSource:0}: Error finding container 0539d91c981a8c3629d3075cfb898b6ac2e13603f2cd8798255850e53cb7fdcd: Status 404 returned error can't find the container with id 0539d91c981a8c3629d3075cfb898b6ac2e13603f2cd8798255850e53cb7fdcd
Apr 22 18:35:59.578185 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.578169 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:35:59.611195 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.611143 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.611402 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.611383 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:35:59.712071 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.712038 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.812650 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:35:59.812568 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-126.ec2.internal\" not found"
Apr 22 18:35:59.816589 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.816564 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:35:59.818414 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.818396 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:35:59.909837 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.909805 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:59.926024 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.925991 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:35:59.926178 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.926112 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal"
Apr 22 18:35:59.936724 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:35:59.936704 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:36:00.010686 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.010652 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:00.386452 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.386419 2571 apiserver.go:52] "Watching apiserver"
Apr 22 18:36:00.395435 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.395408 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:36:00.395899 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.395830 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-47tpt","openshift-multus/multus-additional-cni-plugins-d5f7s","openshift-multus/network-metrics-daemon-8llv8","openshift-network-diagnostics/network-check-target-sd9g2","openshift-network-operator/iptables-alerter-xwhrw","openshift-cluster-node-tuning-operator/tuned-bwbj7","openshift-image-registry/node-ca-2df55","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal","openshift-multus/multus-ql2h8","openshift-ovn-kubernetes/ovnkube-node-ncvvr","kube-system/konnectivity-agent-llq6g","kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz"]
Apr 22 18:36:00.399073 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.398610 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.401036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.400679 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:00.401036 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.400747 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:00.403012 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.401953 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2df55"
Apr 22 18:36:00.403012 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.402515 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jcfwz\""
Apr 22 18:36:00.403012 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.402853 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.403012 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.402913 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:36:00.403264 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.403033 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:36:00.403264 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.403223 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.403462 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.403435 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:36:00.404957 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.404270 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.406656 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.405778 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.406656 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.405994 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-789rh\""
Apr 22 18:36:00.406656 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.406211 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:36:00.406656 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.406414 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.406656 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.406652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.406949 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.406743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.406949 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.406855 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-57dw5\""
Apr 22 18:36:00.407904 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.407884 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:00.407996 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.407959 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:00.409749 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.409607 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.412034 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.411941 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.415885 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.415854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-47tpt"
Apr 22 18:36:00.416766 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.416745 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:36:00.416939 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.416914 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6gsp2\""
Apr 22 18:36:00.417161 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.417125 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-llq6g"
Apr 22 18:36:00.417257 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.417173 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.417257 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.417194 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:36:00.417416 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.417375 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:36:00.417474 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.417417 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-znz9g\""
Apr 22 18:36:00.419573 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.418888 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz"
Apr 22 18:36:00.419573 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.419385 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xwhrw"
Apr 22 18:36:00.419573 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.419532 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.419751 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.419395 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.421063 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.420233 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.421063 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.420270 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:36:00.421063 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.420562 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8k8nq\""
Apr 22 18:36:00.421063 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.420649 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:36:00.421576 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.421508 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-os-release\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.421576 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.421559 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-systemd-units\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.421714 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.421597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-slash\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.421714 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.421628 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-netns\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.422412 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422392 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:36:00.422673 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-log-socket\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.422738 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:00.422738 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-netns\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422738 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422712 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-hostroot\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422738 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-multus-certs\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-ovn\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422825 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-modprobe-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-system-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-cni-binary-copy\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422906 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-multus\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.422954 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-env-overrides\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422958 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l595c\" (UniqueName: \"kubernetes.io/projected/bcfd0f0f-fa19-459b-9541-ffe992fad530-kube-api-access-l595c\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.422986 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-multus-daemon-config\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-hosts-file\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-tmp-dir\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423083 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20357bd7-0dd6-4792-92d1-d5705073a86e-host\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423106 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/20357bd7-0dd6-4792-92d1-d5705073a86e-serviceca\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-cnibin\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-k8s-cni-cncf-io\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423155 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423171 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwx7c\" (UniqueName: \"kubernetes.io/projected/20357bd7-0dd6-4792-92d1-d5705073a86e-kube-api-access-fwx7c\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423195 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423202 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-conf\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423227 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-cnibin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423245 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423260 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdp5\" (UniqueName: \"kubernetes.io/projected/578f7fc7-df68-49e8-ab1c-e8782370ea85-kube-api-access-8cdp5\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.423323 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423274 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-system-cni-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423312 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-etc-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-netd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-run\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423382 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-host\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423386 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-var-lib-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovn-node-metrics-cert\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName:
\"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-script-lib\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423465 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-lib-modules\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423484 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-kubelet\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423519 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjzf\" (UniqueName: \"kubernetes.io/projected/539b75c3-f4eb-4f78-bdd1-438056001519-kube-api-access-9gjzf\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423560 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423582 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-bin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.424091 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423588 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptg64\"" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423596 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-bin\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysconfig\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423638 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-sys\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423688 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-systemd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmmh\" (UniqueName: \"kubernetes.io/projected/b1680f68-7360-441e-8836-c911ac062e82-kube-api-access-9xmmh\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423760 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423902 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.423985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-tmp\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424016 2571 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424019 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-socket-dir-parent\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-node-log\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-os-release\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424160 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lx5wx\"" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-systemd\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-var-lib-kubelet\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424212 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-tuned\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.424821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-conf-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424263 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-kubelet\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-config\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424331 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hbmj6\"" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424337 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgvkl\" (UniqueName: \"kubernetes.io/projected/7ff729b2-3f8f-4665-ae12-c83c4b179998-kube-api-access-jgvkl\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-etc-kubernetes\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:36:00.424401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txt4\" (UniqueName: \"kubernetes.io/projected/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-kube-api-access-4txt4\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.425582 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.424448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-kubernetes\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.457844 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.457800 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:30:59 +0000 UTC" deadline="2027-10-01 07:25:34.693339723 +0000 UTC" Apr 22 18:36:00.457844 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.457843 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12636h49m34.235500519s" Apr 22 18:36:00.512104 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.511852 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:36:00.513409 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.513360 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" event={"ID":"b07c8e88c54ca5836b8287342f637e6f","Type":"ContainerStarted","Data":"2477d3009843e0b0b9a7a5d1c82f2b7c1bc19990e5b2326fe477841de033623b"} Apr 22 18:36:00.514550 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.514521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal" event={"ID":"21199ccb1140108a6b341eb7b765d477","Type":"ContainerStarted","Data":"0539d91c981a8c3629d3075cfb898b6ac2e13603f2cd8798255850e53cb7fdcd"} Apr 22 18:36:00.524842 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.524813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-tuned\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.524997 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.524901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-conf-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.524997 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.524933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-kubelet\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.524997 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.524972 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-config\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.524997 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgvkl\" (UniqueName: 
\"kubernetes.io/projected/7ff729b2-3f8f-4665-ae12-c83c4b179998-kube-api-access-jgvkl\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:00.525136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525027 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-etc-kubernetes\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4txt4\" (UniqueName: \"kubernetes.io/projected/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-kube-api-access-4txt4\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.525136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-kubernetes\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.525136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/481b4ac3-0a49-4659-8d45-8a350c162d28-host-slash\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.525289 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525150 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-device-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.525289 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-etc-selinux\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.525289 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-os-release\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525289 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-systemd-units\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525397 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-slash\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525397 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:00.525335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-netns\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525397 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525363 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-log-socket\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525481 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:00.525481 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525427 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-netns\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525481 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525460 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-hostroot\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525579 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:36:00.525485 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-multus-certs\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525579 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-ovn\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525579 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:00.525579 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-modprobe-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.525698 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-system-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525698 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:00.525657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-cni-binary-copy\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525698 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-multus\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.525794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-env-overrides\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.525794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78qxp\" (UniqueName: \"kubernetes.io/projected/481b4ac3-0a49-4659-8d45-8a350c162d28-kube-api-access-78qxp\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.525854 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l595c\" (UniqueName: \"kubernetes.io/projected/bcfd0f0f-fa19-459b-9541-ffe992fad530-kube-api-access-l595c\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.525854 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dcc5a374-518f-4fe9-b03c-6c43c581735c-agent-certs\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.525854 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/481b4ac3-0a49-4659-8d45-8a350c162d28-iptables-alerter-script\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.525959 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-registration-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.525959 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525928 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-multus-daemon-config\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.526033 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-hosts-file\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.526033 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.525983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-tmp-dir\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.526033 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20357bd7-0dd6-4792-92d1-d5705073a86e-host\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.526141 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/20357bd7-0dd6-4792-92d1-d5705073a86e-serviceca\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.526141 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-cnibin\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.526232 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-k8s-cni-cncf-io\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.526295 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-hostroot\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.526295 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-ovn\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.526493 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526466 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-os-release\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.526577 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-systemd-units\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.526666 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-slash\") pod \"ovnkube-node-ncvvr\" (UID: 
\"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.526666 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526626 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-netns\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.526737 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-log-socket\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.526791 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.526775 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:00.526845 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.526925 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/20357bd7-0dd6-4792-92d1-d5705073a86e-host\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.526925 ip-10-0-134-126 kubenswrapper[2571]: E0422 
18:36:00.526922 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:01.026898585 +0000 UTC m=+3.093111655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:00.527052 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.526934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-cnibin\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.527107 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527091 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-netns\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527394 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527369 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-tmp-dir\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.527466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527376 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-modprobe-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.527466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527412 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-conf-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-k8s-cni-cncf-io\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527450 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-multus-daemon-config\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-sys-fs\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwx7c\" 
(UniqueName: \"kubernetes.io/projected/20357bd7-0dd6-4792-92d1-d5705073a86e-kube-api-access-fwx7c\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-hosts-file\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-system-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527090 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.527903 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527729 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-conf\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.527903 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-cnibin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.527903 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.527903 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdp5\" (UniqueName: \"kubernetes.io/projected/578f7fc7-df68-49e8-ab1c-e8782370ea85-kube-api-access-8cdp5\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.527903 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-system-cni-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:36:00.527909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527941 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-etc-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-cnibin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.527980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-netd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-conf\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528010 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.528095 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-run\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528353 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528104 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-run-multus-certs\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.528353 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-etc-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528353 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysctl-d\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528470 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528385 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528470 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-system-cni-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.528543 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528468 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-netd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528543 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-env-overrides\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528543 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-host\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528635 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:00.528548 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-multus\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.528635 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-var-lib-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528635 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528554 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-host\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528635 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528587 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-run\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528766 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528766 ip-10-0-134-126 kubenswrapper[2571]: 
I0422 18:36:00.528692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovn-node-metrics-cert\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528766 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-script-lib\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.528766 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-lib-modules\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.528885 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-kubelet\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.528885 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.528831 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjzf\" (UniqueName: \"kubernetes.io/projected/539b75c3-f4eb-4f78-bdd1-438056001519-kube-api-access-9gjzf\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.528964 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:00.528887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.529023 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-bin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.529060 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-bin\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.529092 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-binary-copy\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.529092 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-config\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.529092 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysconfig\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.529184 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529124 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-kubelet\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.529184 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-sysconfig\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.529242 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529189 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-kubelet\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.529242 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " 
pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.529242 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-lib-modules\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.529333 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-etc-kubernetes\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.529333 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-kubernetes\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.529333 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-host-var-lib-cni-bin\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.529447 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.529447 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529370 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-host-cni-bin\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.529447 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529437 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:36:00.529548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-sys\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.529963 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.529776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dcc5a374-518f-4fe9-b03c-6c43c581735c-konnectivity-ca\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.530146 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-var-lib-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.530250 ip-10-0-134-126 kubenswrapper[2571]: 
I0422 18:36:00.530228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1680f68-7360-441e-8836-c911ac062e82-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.530342 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/20357bd7-0dd6-4792-92d1-d5705073a86e-serviceca\") pod \"node-ca-2df55\" (UID: \"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.530394 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530346 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.530448 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-sys\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.530448 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530416 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-socket-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.530546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530447 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.530546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-systemd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.530546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmmh\" (UniqueName: \"kubernetes.io/projected/b1680f68-7360-441e-8836-c911ac062e82-kube-api-access-9xmmh\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.530546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/539b75c3-f4eb-4f78-bdd1-438056001519-cni-binary-copy\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.530546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-tmp\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.530762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qzp\" (UniqueName: \"kubernetes.io/projected/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kube-api-access-m7qzp\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.530762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-cni-dir\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.530762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530657 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovnkube-script-lib\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.530762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-systemd\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.530762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530717 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-socket-dir-parent\") pod 
\"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.531036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.531036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/539b75c3-f4eb-4f78-bdd1-438056001519-multus-socket-dir-parent\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.531036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-node-log\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.531036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.530962 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-run-openvswitch\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.531036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-os-release\") pod 
\"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-systemd\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-var-lib-kubelet\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1680f68-7360-441e-8836-c911ac062e82-os-release\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/578f7fc7-df68-49e8-ab1c-e8782370ea85-node-log\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-var-lib-kubelet\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.531304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.531183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-systemd\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.533348 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.533266 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-etc-tuned\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.533348 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.533326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/578f7fc7-df68-49e8-ab1c-e8782370ea85-ovn-node-metrics-cert\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.534000 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.533980 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bcfd0f0f-fa19-459b-9541-ffe992fad530-tmp\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.537080 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.537058 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 22 18:36:00.537180 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.537085 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:00.537180 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.537099 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:00.537180 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:00.537164 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. No retries permitted until 2026-04-22 18:36:01.037145595 +0000 UTC m=+3.103358645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:00.542162 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.542135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdp5\" (UniqueName: \"kubernetes.io/projected/578f7fc7-df68-49e8-ab1c-e8782370ea85-kube-api-access-8cdp5\") pod \"ovnkube-node-ncvvr\" (UID: \"578f7fc7-df68-49e8-ab1c-e8782370ea85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.544015 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.543992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmmh\" (UniqueName: \"kubernetes.io/projected/b1680f68-7360-441e-8836-c911ac062e82-kube-api-access-9xmmh\") pod \"multus-additional-cni-plugins-d5f7s\" (UID: \"b1680f68-7360-441e-8836-c911ac062e82\") " pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.544421 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.544402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l595c\" (UniqueName: \"kubernetes.io/projected/bcfd0f0f-fa19-459b-9541-ffe992fad530-kube-api-access-l595c\") pod \"tuned-bwbj7\" (UID: \"bcfd0f0f-fa19-459b-9541-ffe992fad530\") " pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.544547 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.544506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwx7c\" (UniqueName: \"kubernetes.io/projected/20357bd7-0dd6-4792-92d1-d5705073a86e-kube-api-access-fwx7c\") pod \"node-ca-2df55\" (UID: 
\"20357bd7-0dd6-4792-92d1-d5705073a86e\") " pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.544794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.544776 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjzf\" (UniqueName: \"kubernetes.io/projected/539b75c3-f4eb-4f78-bdd1-438056001519-kube-api-access-9gjzf\") pod \"multus-ql2h8\" (UID: \"539b75c3-f4eb-4f78-bdd1-438056001519\") " pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.544931 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.544911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txt4\" (UniqueName: \"kubernetes.io/projected/2bd7e4a3-0f74-4ca3-848f-f713afd48c22-kube-api-access-4txt4\") pod \"node-resolver-47tpt\" (UID: \"2bd7e4a3-0f74-4ca3-848f-f713afd48c22\") " pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.546693 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.546669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgvkl\" (UniqueName: \"kubernetes.io/projected/7ff729b2-3f8f-4665-ae12-c83c4b179998-kube-api-access-jgvkl\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:00.631451 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-etc-selinux\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631490 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78qxp\" (UniqueName: 
\"kubernetes.io/projected/481b4ac3-0a49-4659-8d45-8a350c162d28-kube-api-access-78qxp\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631514 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dcc5a374-518f-4fe9-b03c-6c43c581735c-agent-certs\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/481b4ac3-0a49-4659-8d45-8a350c162d28-iptables-alerter-script\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631549 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-etc-selinux\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-registration-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631603 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-sys-fs\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dcc5a374-518f-4fe9-b03c-6c43c581735c-konnectivity-ca\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.631671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-socket-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631722 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qzp\" (UniqueName: \"kubernetes.io/projected/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kube-api-access-m7qzp\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/481b4ac3-0a49-4659-8d45-8a350c162d28-host-slash\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631779 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-device-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-device-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631935 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-registration-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.631970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.632053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-socket-dir\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.632097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/481b4ac3-0a49-4659-8d45-8a350c162d28-host-slash\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.632552 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.632147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-sys-fs\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.632552 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.632455 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/481b4ac3-0a49-4659-8d45-8a350c162d28-iptables-alerter-script\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.632654 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.632625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/dcc5a374-518f-4fe9-b03c-6c43c581735c-konnectivity-ca\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.634189 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.634166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dcc5a374-518f-4fe9-b03c-6c43c581735c-agent-certs\") pod \"konnectivity-agent-llq6g\" (UID: \"dcc5a374-518f-4fe9-b03c-6c43c581735c\") " pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.640058 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.640002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qzp\" (UniqueName: \"kubernetes.io/projected/5ee5dbab-f80b-4ccd-97c3-0fca5299a371-kube-api-access-m7qzp\") pod \"aws-ebs-csi-driver-node-hcdjz\" (UID: \"5ee5dbab-f80b-4ccd-97c3-0fca5299a371\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.640162 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.640079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78qxp\" (UniqueName: \"kubernetes.io/projected/481b4ac3-0a49-4659-8d45-8a350c162d28-kube-api-access-78qxp\") pod \"iptables-alerter-xwhrw\" (UID: \"481b4ac3-0a49-4659-8d45-8a350c162d28\") " pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:00.715153 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.715112 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" Apr 22 18:36:00.724985 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.724951 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2df55" Apr 22 18:36:00.732895 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.732717 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" Apr 22 18:36:00.738773 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.738748 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ql2h8" Apr 22 18:36:00.745493 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.745458 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" Apr 22 18:36:00.752123 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.752098 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-47tpt" Apr 22 18:36:00.764797 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.764767 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-llq6g" Apr 22 18:36:00.770534 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.770510 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" Apr 22 18:36:00.777250 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:00.777229 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xwhrw" Apr 22 18:36:01.034440 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.034321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:01.034589 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.034477 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:01.034589 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.034566 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:02.034545166 +0000 UTC m=+4.100758225 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:01.134918 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.134862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:01.135087 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.134995 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:01.135087 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.135026 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:01.135087 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.135039 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:01.135190 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:01.135096 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:02.135082336 +0000 UTC m=+4.201295389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:01.299125 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.299085 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1680f68_7360_441e_8836_c911ac062e82.slice/crio-63b36d0cb4af7b9afc59c096ba2ef6b6f3998c5c68b0124ad3893f6c6dbc45d2 WatchSource:0}: Error finding container 63b36d0cb4af7b9afc59c096ba2ef6b6f3998c5c68b0124ad3893f6c6dbc45d2: Status 404 returned error can't find the container with id 63b36d0cb4af7b9afc59c096ba2ef6b6f3998c5c68b0124ad3893f6c6dbc45d2 Apr 22 18:36:01.303889 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.303848 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod539b75c3_f4eb_4f78_bdd1_438056001519.slice/crio-9a3f08e45d9a375a350b1486031d8f0235b9388dd002fcf5a0692a17a2931023 WatchSource:0}: Error finding container 9a3f08e45d9a375a350b1486031d8f0235b9388dd002fcf5a0692a17a2931023: Status 404 returned error can't find the container with id 9a3f08e45d9a375a350b1486031d8f0235b9388dd002fcf5a0692a17a2931023 Apr 22 18:36:01.304474 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.304446 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd7e4a3_0f74_4ca3_848f_f713afd48c22.slice/crio-8015512d1822df8ad036fd99a8c89227baaf61f21d761083fb42aa61b017aa6c WatchSource:0}: Error finding container 
8015512d1822df8ad036fd99a8c89227baaf61f21d761083fb42aa61b017aa6c: Status 404 returned error can't find the container with id 8015512d1822df8ad036fd99a8c89227baaf61f21d761083fb42aa61b017aa6c Apr 22 18:36:01.305940 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.305914 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ee5dbab_f80b_4ccd_97c3_0fca5299a371.slice/crio-18a4e9775a4ce006076bd7d9f3ef18c23049b29a9824dfb47165b0c0eb476aef WatchSource:0}: Error finding container 18a4e9775a4ce006076bd7d9f3ef18c23049b29a9824dfb47165b0c0eb476aef: Status 404 returned error can't find the container with id 18a4e9775a4ce006076bd7d9f3ef18c23049b29a9824dfb47165b0c0eb476aef Apr 22 18:36:01.306304 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.306094 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578f7fc7_df68_49e8_ab1c_e8782370ea85.slice/crio-2db7c18e8bee6b786d40037ae0d3b0f5fdf43b2e21510774962e6aeab5a42b23 WatchSource:0}: Error finding container 2db7c18e8bee6b786d40037ae0d3b0f5fdf43b2e21510774962e6aeab5a42b23: Status 404 returned error can't find the container with id 2db7c18e8bee6b786d40037ae0d3b0f5fdf43b2e21510774962e6aeab5a42b23 Apr 22 18:36:01.308317 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.307580 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481b4ac3_0a49_4659_8d45_8a350c162d28.slice/crio-132c4e5bd04ce7bff4db4e33a04373211dac44df3e438a2f2e25c469bf64af3f WatchSource:0}: Error finding container 132c4e5bd04ce7bff4db4e33a04373211dac44df3e438a2f2e25c469bf64af3f: Status 404 returned error can't find the container with id 132c4e5bd04ce7bff4db4e33a04373211dac44df3e438a2f2e25c469bf64af3f Apr 22 18:36:01.308746 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.308715 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc5a374_518f_4fe9_b03c_6c43c581735c.slice/crio-f1650a6abe1972a79611f80de8c2354effb1056ec2b265476683596d0a7f6254 WatchSource:0}: Error finding container f1650a6abe1972a79611f80de8c2354effb1056ec2b265476683596d0a7f6254: Status 404 returned error can't find the container with id f1650a6abe1972a79611f80de8c2354effb1056ec2b265476683596d0a7f6254 Apr 22 18:36:01.309579 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.309537 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcfd0f0f_fa19_459b_9541_ffe992fad530.slice/crio-7a1149379df93bbb0b99e658d4c49067aeb79f0ae3cea66283a2bb9911d3a1b6 WatchSource:0}: Error finding container 7a1149379df93bbb0b99e658d4c49067aeb79f0ae3cea66283a2bb9911d3a1b6: Status 404 returned error can't find the container with id 7a1149379df93bbb0b99e658d4c49067aeb79f0ae3cea66283a2bb9911d3a1b6 Apr 22 18:36:01.310720 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:01.310701 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20357bd7_0dd6_4792_92d1_d5705073a86e.slice/crio-1f28317709628a46047cba2646a0cd0329254133c54c8f5c751967165cc12768 WatchSource:0}: Error finding container 1f28317709628a46047cba2646a0cd0329254133c54c8f5c751967165cc12768: Status 404 returned error can't find the container with id 1f28317709628a46047cba2646a0cd0329254133c54c8f5c751967165cc12768 Apr 22 18:36:01.458762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.458713 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:30:59 +0000 UTC" deadline="2027-12-21 06:33:07.253469394 +0000 UTC" Apr 22 18:36:01.458762 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.458758 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14579h57m5.794714999s" Apr 22 18:36:01.517816 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.517777 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal" event={"ID":"21199ccb1140108a6b341eb7b765d477","Type":"ContainerStarted","Data":"67929030ad180e61dd4baf8b2a58f21927b72c2ed7568f8d7ab4ad1474c1756f"} Apr 22 18:36:01.518920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.518896 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2df55" event={"ID":"20357bd7-0dd6-4792-92d1-d5705073a86e","Type":"ContainerStarted","Data":"1f28317709628a46047cba2646a0cd0329254133c54c8f5c751967165cc12768"} Apr 22 18:36:01.519793 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.519775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" event={"ID":"bcfd0f0f-fa19-459b-9541-ffe992fad530","Type":"ContainerStarted","Data":"7a1149379df93bbb0b99e658d4c49067aeb79f0ae3cea66283a2bb9911d3a1b6"} Apr 22 18:36:01.520711 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.520690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xwhrw" event={"ID":"481b4ac3-0a49-4659-8d45-8a350c162d28","Type":"ContainerStarted","Data":"132c4e5bd04ce7bff4db4e33a04373211dac44df3e438a2f2e25c469bf64af3f"} Apr 22 18:36:01.521668 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.521642 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ql2h8" event={"ID":"539b75c3-f4eb-4f78-bdd1-438056001519","Type":"ContainerStarted","Data":"9a3f08e45d9a375a350b1486031d8f0235b9388dd002fcf5a0692a17a2931023"} Apr 22 18:36:01.522558 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.522540 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-llq6g" 
event={"ID":"dcc5a374-518f-4fe9-b03c-6c43c581735c","Type":"ContainerStarted","Data":"f1650a6abe1972a79611f80de8c2354effb1056ec2b265476683596d0a7f6254"} Apr 22 18:36:01.523483 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.523460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"2db7c18e8bee6b786d40037ae0d3b0f5fdf43b2e21510774962e6aeab5a42b23"} Apr 22 18:36:01.524345 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.524314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" event={"ID":"5ee5dbab-f80b-4ccd-97c3-0fca5299a371","Type":"ContainerStarted","Data":"18a4e9775a4ce006076bd7d9f3ef18c23049b29a9824dfb47165b0c0eb476aef"} Apr 22 18:36:01.525156 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.525140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-47tpt" event={"ID":"2bd7e4a3-0f74-4ca3-848f-f713afd48c22","Type":"ContainerStarted","Data":"8015512d1822df8ad036fd99a8c89227baaf61f21d761083fb42aa61b017aa6c"} Apr 22 18:36:01.527691 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.527266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerStarted","Data":"63b36d0cb4af7b9afc59c096ba2ef6b6f3998c5c68b0124ad3893f6c6dbc45d2"} Apr 22 18:36:01.533283 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:01.533244 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-126.ec2.internal" podStartSLOduration=2.533233672 podStartE2EDuration="2.533233672s" podCreationTimestamp="2026-04-22 18:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 18:36:01.531948227 +0000 UTC m=+3.598161297" watchObservedRunningTime="2026-04-22 18:36:01.533233672 +0000 UTC m=+3.599446751" Apr 22 18:36:02.046707 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.046146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:02.046707 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.046311 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:02.046707 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.046384 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:04.046363521 +0000 UTC m=+6.112576583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:02.147425 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.147385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:02.147600 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.147558 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:02.147600 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.147577 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:02.147600 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.147589 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:02.147762 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.147652 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:04.147632878 +0000 UTC m=+6.213845928 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:02.427132 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.426186 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hlw26"] Apr 22 18:36:02.429435 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.428973 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.429435 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.429050 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:02.509414 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.509378 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:02.509858 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.509535 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:02.510140 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.509995 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:02.510140 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.510097 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:02.550484 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.550449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-kubelet-config\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.550653 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.550522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-dbus\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.550653 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.550590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod 
\"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.556601 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.556571 2571 generic.go:358] "Generic (PLEG): container finished" podID="b07c8e88c54ca5836b8287342f637e6f" containerID="40c45405b9335878231d060e1d2565b06a4e7f0659436615fa80dbe7470a7396" exitCode=0 Apr 22 18:36:02.556746 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.556683 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" event={"ID":"b07c8e88c54ca5836b8287342f637e6f","Type":"ContainerDied","Data":"40c45405b9335878231d060e1d2565b06a4e7f0659436615fa80dbe7470a7396"} Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.651085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-kubelet-config\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.651161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-dbus\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.651228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " 
pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.651349 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:02.651415 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:03.151397016 +0000 UTC m=+5.217610069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.651679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-kubelet-config\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:02.651863 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:02.651819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6adad479-9e37-4d83-b878-e22fdf6dc3d9-dbus\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:03.156037 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:03.155950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:03.156191 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:03.156109 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:03.156191 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:03.156169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:04.156151168 +0000 UTC m=+6.222364221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:03.584664 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:03.584624 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" event={"ID":"b07c8e88c54ca5836b8287342f637e6f","Type":"ContainerStarted","Data":"d33642b3fd4ae61bfc8b15b917c76c36f4331184389c607d6eb274bda44db92a"} Apr 22 18:36:04.063061 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.062933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:04.063378 ip-10-0-134-126 kubenswrapper[2571]: E0422 
18:36:04.063090 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:04.063378 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.063162 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:08.06314383 +0000 UTC m=+10.129356886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.164295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.164350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.164474 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 
18:36:04.164543 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:06.164523725 +0000 UTC m=+8.230736792 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.165010 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.165031 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.165044 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:04.165121 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.165133 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. No retries permitted until 2026-04-22 18:36:08.165116797 +0000 UTC m=+10.231329845 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:04.509116 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.508887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:04.509116 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.508915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:04.509116 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.509056 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:04.509854 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.509572 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:04.509854 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:04.509627 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:04.509854 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:04.509713 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:06.181885 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:06.181517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:06.181885 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:06.181706 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.181885 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:06.181780 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.181758687 +0000 UTC m=+12.247971743 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.510251 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:06.509414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:06.510251 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:06.509559 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:06.510251 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:06.510013 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:06.510251 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:06.510133 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:06.510773 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:06.510643 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:06.510773 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:06.510737 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:08.098999 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:08.098958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:08.099486 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.099124 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:08.099486 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.099205 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:16.099187182 +0000 UTC m=+18.165400233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:08.200058 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:08.200015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:08.200262 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.200241 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:08.200323 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.200269 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:08.200323 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.200283 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:08.200433 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.200347 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:16.200328884 +0000 UTC m=+18.266541933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:08.510127 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:08.510014 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:08.510298 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.510146 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:08.510298 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:08.510213 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:08.510298 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.510289 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:08.510662 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:08.510508 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:08.510662 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:08.510609 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:10.215741 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:10.215705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:10.216177 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:10.215861 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:10.216177 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:10.215943 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:18.2159256 +0000 UTC m=+20.282138656 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:10.509422 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:10.509334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:10.509422 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:10.509381 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:10.509634 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:10.509336 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:10.509634 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:10.509466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:10.509634 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:10.509579 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:10.509634 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:10.509626 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:12.509403 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:12.509366 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:12.509860 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:12.509366 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:12.509860 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:12.509507 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:12.509860 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:12.509366 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:12.509860 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:12.509601 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:12.509860 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:12.509667 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:14.508962 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:14.508887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:14.508962 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:14.508910 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:14.509472 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:14.508895 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:14.509472 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:14.509020 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:14.509472 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:14.509099 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:14.509472 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:14.509184 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:16.153182 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:16.152986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:16.153182 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.153164 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:16.153716 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.153251 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:32.153229137 +0000 UTC m=+34.219442198 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:16.254360 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:16.254310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:16.254536 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.254512 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:16.254600 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.254538 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:16.254600 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.254552 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:16.254666 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.254621 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:32.254600738 +0000 UTC m=+34.320813803 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:16.508730 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:16.508654 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:16.508898 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:16.508654 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:16.508898 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.508769 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:16.508898 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:16.508661 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:16.508898 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.508835 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:16.509050 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:16.508928 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:18.270338 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:18.270299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:18.270663 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:18.270442 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:18.270663 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:18.270507 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret podName:6adad479-9e37-4d83-b878-e22fdf6dc3d9 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:34.270490368 +0000 UTC m=+36.336703419 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret") pod "global-pull-secret-syncer-hlw26" (UID: "6adad479-9e37-4d83-b878-e22fdf6dc3d9") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:18.509812 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:18.509735 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:18.509979 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:18.509848 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:18.509979 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:18.509924 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:18.510053 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:18.510009 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:18.510053 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:18.510044 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:18.510114 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:18.510091 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:19.616108 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.615887 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"df2ff869643a63a9fd677d5bf8056c0ba3e9ab61014fa86d3cc5f6cfc87cf8a9"} Apr 22 18:36:19.616794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.616117 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"70948d31f6f73d70a9b0f19dc10b7e447d23522a1a38a3bb2731eb6e3f6a29fc"} Apr 22 18:36:19.616794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.616133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"7450d3adbd698504aa29bc4957b2eb86374b482b5c85116e2ac7ea479fbb82ca"} Apr 22 18:36:19.616794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.616175 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"c2725ea0ba9349156b08f600f0da207313e2b26dd20af9edb0c5cde21f04036e"} Apr 22 18:36:19.616794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.616189 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"94622eb5e021f332df707774a18aed470bf79182d98f803848fc2b7d3a863d9f"} Apr 22 18:36:19.616794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.616200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"a3441a30e5076170ee8be7d7fed800c6947aa8c5372814f11dffffa04e01a6e1"} Apr 22 18:36:19.617085 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.617063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" event={"ID":"5ee5dbab-f80b-4ccd-97c3-0fca5299a371","Type":"ContainerStarted","Data":"7af9c6ae3446c4deabf82c187dcb26555284614521f5a365ebc8db21f5e2d059"} Apr 22 18:36:19.618136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.618114 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-47tpt" event={"ID":"2bd7e4a3-0f74-4ca3-848f-f713afd48c22","Type":"ContainerStarted","Data":"0fc9b0401fc1c230fa7fa519b1978bbb52974b569fbd36694d241ca821a7b4eb"} Apr 22 18:36:19.619441 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.619422 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="61636178ffa42d4b5f0d77910df45bc61abcce4ef9041e2d1acabb93af8aeabb" exitCode=0 Apr 22 18:36:19.619517 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.619486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"61636178ffa42d4b5f0d77910df45bc61abcce4ef9041e2d1acabb93af8aeabb"} Apr 22 18:36:19.620785 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.620767 2571 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2df55" event={"ID":"20357bd7-0dd6-4792-92d1-d5705073a86e","Type":"ContainerStarted","Data":"56a96363d62ceef3cf287abb73df4bd4d7cff6ba5e1a0f7bfaedbfd047ffea84"} Apr 22 18:36:19.622183 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.622163 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" event={"ID":"bcfd0f0f-fa19-459b-9541-ffe992fad530","Type":"ContainerStarted","Data":"8a44b4002ea1ac284232e9a694db0e44c45b5622e8aa24625bf007f0507afc48"} Apr 22 18:36:19.623446 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.623426 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ql2h8" event={"ID":"539b75c3-f4eb-4f78-bdd1-438056001519","Type":"ContainerStarted","Data":"fd0f479d9976439c86589339de5612348c0e9cba0ecdbaebfa91002858da0a41"} Apr 22 18:36:19.624595 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.624578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-llq6g" event={"ID":"dcc5a374-518f-4fe9-b03c-6c43c581735c","Type":"ContainerStarted","Data":"23206465d2f1445089dd591b2d2d2a00b2d5ac493441454fb08d98600fb0d4ec"} Apr 22 18:36:19.630789 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.630752 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-126.ec2.internal" podStartSLOduration=20.630740207 podStartE2EDuration="20.630740207s" podCreationTimestamp="2026-04-22 18:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:03.600196259 +0000 UTC m=+5.666409329" watchObservedRunningTime="2026-04-22 18:36:19.630740207 +0000 UTC m=+21.696953275" Apr 22 18:36:19.631181 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.631156 2571 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-dns/node-resolver-47tpt" podStartSLOduration=4.345875577 podStartE2EDuration="21.631151061s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.306904203 +0000 UTC m=+3.373117259" lastFinishedPulling="2026-04-22 18:36:18.592179696 +0000 UTC m=+20.658392743" observedRunningTime="2026-04-22 18:36:19.630826158 +0000 UTC m=+21.697039250" watchObservedRunningTime="2026-04-22 18:36:19.631151061 +0000 UTC m=+21.697364130" Apr 22 18:36:19.648019 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.647973 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bwbj7" podStartSLOduration=4.366017676 podStartE2EDuration="21.64795984s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.311733119 +0000 UTC m=+3.377946166" lastFinishedPulling="2026-04-22 18:36:18.593675268 +0000 UTC m=+20.659888330" observedRunningTime="2026-04-22 18:36:19.647377465 +0000 UTC m=+21.713590534" watchObservedRunningTime="2026-04-22 18:36:19.64795984 +0000 UTC m=+21.714172909" Apr 22 18:36:19.659790 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.659737 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-llq6g" podStartSLOduration=4.378865677 podStartE2EDuration="21.65972468s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.310981614 +0000 UTC m=+3.377194664" lastFinishedPulling="2026-04-22 18:36:18.591840615 +0000 UTC m=+20.658053667" observedRunningTime="2026-04-22 18:36:19.659544636 +0000 UTC m=+21.725757717" watchObservedRunningTime="2026-04-22 18:36:19.65972468 +0000 UTC m=+21.725937748" Apr 22 18:36:19.703297 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.703239 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2df55" 
podStartSLOduration=12.699281127 podStartE2EDuration="21.703223617s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.312804123 +0000 UTC m=+3.379017169" lastFinishedPulling="2026-04-22 18:36:10.316746601 +0000 UTC m=+12.382959659" observedRunningTime="2026-04-22 18:36:19.702828374 +0000 UTC m=+21.769041444" watchObservedRunningTime="2026-04-22 18:36:19.703223617 +0000 UTC m=+21.769436686" Apr 22 18:36:19.717798 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:19.717744 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ql2h8" podStartSLOduration=4.421918194 podStartE2EDuration="21.717727852s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.306101872 +0000 UTC m=+3.372314933" lastFinishedPulling="2026-04-22 18:36:18.601911538 +0000 UTC m=+20.668124591" observedRunningTime="2026-04-22 18:36:19.71716051 +0000 UTC m=+21.783373580" watchObservedRunningTime="2026-04-22 18:36:19.717727852 +0000 UTC m=+21.783940921" Apr 22 18:36:20.508634 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.508604 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:20.508882 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:20.508739 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9" Apr 22 18:36:20.509394 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.509119 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:20.509394 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:20.509245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998" Apr 22 18:36:20.509394 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.509294 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:20.509394 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:20.509359 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c" Apr 22 18:36:20.630459 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.630198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xwhrw" event={"ID":"481b4ac3-0a49-4659-8d45-8a350c162d28","Type":"ContainerStarted","Data":"c4dc7c2d7792821d7d4216b64b62049e47ae606f59565933d8f35c35f686baea"} Apr 22 18:36:20.640845 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.640819 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:36:20.658197 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:20.658149 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xwhrw" podStartSLOduration=4.3975926 podStartE2EDuration="21.658129447s" podCreationTimestamp="2026-04-22 18:35:59 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.310230089 +0000 UTC m=+3.376443135" lastFinishedPulling="2026-04-22 18:36:18.570766921 +0000 UTC m=+20.636979982" observedRunningTime="2026-04-22 18:36:20.658060945 +0000 UTC m=+22.724274013" watchObservedRunningTime="2026-04-22 18:36:20.658129447 +0000 UTC m=+22.724342518" Apr 22 18:36:21.482150 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.482029 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:36:20.640839438Z","UUID":"87d79eef-a2d5-47a6-866d-461edd5bc36e","Handler":null,"Name":"","Endpoint":""} Apr 22 18:36:21.486935 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.486898 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:36:21.486935 
ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.486935 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:36:21.635812 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.635589 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"8ef3efdb2b91cf13f85a36fcf96f1748f0afb83b585a928a95279a7e04d836ac"} Apr 22 18:36:21.637948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.637881 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" event={"ID":"5ee5dbab-f80b-4ccd-97c3-0fca5299a371","Type":"ContainerStarted","Data":"338f47181468690e9af9f9aef0005252bfc4a68e3e3d1eebe0f838f0d0fde29b"} Apr 22 18:36:21.637948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.637925 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" event={"ID":"5ee5dbab-f80b-4ccd-97c3-0fca5299a371","Type":"ContainerStarted","Data":"76c973f4210f8acd35209d5999c408efe9f27ef62ab3056d3e0044a5b4443801"} Apr 22 18:36:21.657658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.657607 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hcdjz" podStartSLOduration=2.5280771399999997 podStartE2EDuration="22.657590688s" podCreationTimestamp="2026-04-22 18:35:59 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.307923774 +0000 UTC m=+3.374136834" lastFinishedPulling="2026-04-22 18:36:21.437437322 +0000 UTC m=+23.503650382" observedRunningTime="2026-04-22 18:36:21.657446747 +0000 UTC m=+23.723659817" watchObservedRunningTime="2026-04-22 18:36:21.657590688 +0000 UTC m=+23.723803758" Apr 22 18:36:21.807030 ip-10-0-134-126 kubenswrapper[2571]: I0422 
18:36:21.806982 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-llq6g"
Apr 22 18:36:21.807731 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:21.807705 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-llq6g"
Apr 22 18:36:22.509099 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:22.509063 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:22.509099 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:22.509086 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:22.509313 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:22.509111 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:22.509313 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:22.509209 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:22.509391 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:22.509320 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:22.509454 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:22.509435 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:22.640457 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:22.640418 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-llq6g"
Apr 22 18:36:22.641128 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:22.640841 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-llq6g"
Apr 22 18:36:24.509015 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.508928 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:24.509516 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.508929 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:24.509516 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:24.509043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:24.509516 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.508929 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:24.509516 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:24.509115 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:24.509516 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:24.509213 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:24.647699 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.647653 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" event={"ID":"578f7fc7-df68-49e8-ab1c-e8782370ea85","Type":"ContainerStarted","Data":"e7c14bc133e2e2b05e4a975c364ea3c5bb087fabfafd61f56c89f8895281ddc0"}
Apr 22 18:36:24.648694 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.648664 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:24.652204 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.652179 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="338416f7561f35d59b26ef53ec66b733d7d5fde90bf58fce24f360e79d6b40a0" exitCode=0
Apr 22 18:36:24.652322 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.652246 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"338416f7561f35d59b26ef53ec66b733d7d5fde90bf58fce24f360e79d6b40a0"}
Apr 22 18:36:24.665806 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.665783 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:24.675917 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:24.675856 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr" podStartSLOduration=9.05935027 podStartE2EDuration="26.675842755s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.30864213 +0000 UTC m=+3.374855178" lastFinishedPulling="2026-04-22 18:36:18.925134613 +0000 UTC m=+20.991347663" observedRunningTime="2026-04-22 18:36:24.675542083 +0000 UTC m=+26.741755152" watchObservedRunningTime="2026-04-22 18:36:24.675842755 +0000 UTC m=+26.742055824"
Apr 22 18:36:25.653951 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.653923 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:25.654405 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.653965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:25.668136 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.668112 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:36:25.868599 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.868566 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sd9g2"]
Apr 22 18:36:25.868752 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.868711 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:25.868842 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:25.868818 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:25.871085 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.871065 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hlw26"]
Apr 22 18:36:25.871210 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.871173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:25.871293 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:25.871271 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:25.873900 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.873856 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8llv8"]
Apr 22 18:36:25.874006 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:25.873991 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:25.874117 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:25.874098 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:26.657093 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:26.657053 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="7013d24d7298bf2ef90b56517780d2c811e1f8505f5c774d977f49a0d5b4349d" exitCode=0
Apr 22 18:36:26.657616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:26.657134 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"7013d24d7298bf2ef90b56517780d2c811e1f8505f5c774d977f49a0d5b4349d"}
Apr 22 18:36:27.509309 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:27.509057 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:27.509466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:27.509057 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:27.509466 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:27.509338 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:27.509466 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:27.509057 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:27.509466 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:27.509404 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:27.509466 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:27.509466 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:27.661928 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:27.661897 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="b31903788af9f6a1772d7edec265e9948d1434b10a741dd5fdf9625943fe00f3" exitCode=0
Apr 22 18:36:27.662316 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:27.661980 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"b31903788af9f6a1772d7edec265e9948d1434b10a741dd5fdf9625943fe00f3"}
Apr 22 18:36:29.509395 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:29.509351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:29.509395 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:29.509375 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:29.509988 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:29.509351 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:29.509988 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:29.509476 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:29.509988 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:29.509563 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:29.509988 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:29.509646 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:31.508962 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.508925 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:36:31.509548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.508925 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26"
Apr 22 18:36:31.509548 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:31.509071 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:36:31.509548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.508925 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:36:31.509548 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:31.509159 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hlw26" podUID="6adad479-9e37-4d83-b878-e22fdf6dc3d9"
Apr 22 18:36:31.509548 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:31.509205 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sd9g2" podUID="743f27e4-2f91-43c9-a360-363424f5563c"
Apr 22 18:36:31.756276 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.756244 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-126.ec2.internal" event="NodeReady"
Apr 22 18:36:31.756483 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.756403 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:36:31.794127 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.794081 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"]
Apr 22 18:36:31.818692 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.818646 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"]
Apr 22 18:36:31.818917 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.818835 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"
Apr 22 18:36:31.824820 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.824261 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 18:36:31.824820 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.824180 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4xx5w\""
Apr 22 18:36:31.825036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.824931 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 18:36:31.825036 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.824975 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 18:36:31.825618 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.825596 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 18:36:31.833931 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.833906 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"]
Apr 22 18:36:31.834186 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.834166 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.836843 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.836808 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vvpgg\""
Apr 22 18:36:31.837108 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.837089 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:36:31.837280 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.837261 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:36:31.837568 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.837549 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:36:31.843385 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.843362 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:36:31.849830 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.849801 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"]
Apr 22 18:36:31.849994 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.849976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.852587 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.852568 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 18:36:31.852687 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.852603 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 18:36:31.852687 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.852671 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 18:36:31.853106 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.853089 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 18:36:31.869750 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.869727 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"]
Apr 22 18:36:31.869750 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.869750 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"]
Apr 22 18:36:31.869951 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.869760 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"]
Apr 22 18:36:31.869951 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.869776 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pfpxj"]
Apr 22 18:36:31.869951 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.869861 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"
Apr 22 18:36:31.872337 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.872321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 18:36:31.887029 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.887011 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v26l6"]
Apr 22 18:36:31.887184 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.887166 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:31.889530 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.889508 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:36:31.889638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.889516 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:36:31.889638 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.889573 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\""
Apr 22 18:36:31.907925 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.907862 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"]
Apr 22 18:36:31.908060 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.907933 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v26l6"]
Apr 22 18:36:31.908060 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.907946 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pfpxj"]
Apr 22 18:36:31.908060 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.908007 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:36:31.910787 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.910561 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:36:31.910787 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.910587 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:36:31.910787 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.910689 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\""
Apr 22 18:36:31.910787 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.910711 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:36:31.980617 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a10495a7-5071-4ef9-bc74-dad139a44eb8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"
Apr 22 18:36:31.980617 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980574 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw972\" (UniqueName: \"kubernetes.io/projected/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-kube-api-access-cw972\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"
Apr 22 18:36:31.980617 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7162b6-24ec-432c-8317-6eff35ee7f87-config-volume\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-tmp\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stz88\" (UniqueName: \"kubernetes.io/projected/a10495a7-5071-4ef9-bc74-dad139a44eb8-kube-api-access-stz88\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980752 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980885 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/432cf752-8129-4b79-956e-b18dc2cdebbb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.980924 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-252xh\" (UniqueName: \"kubernetes.io/projected/432cf752-8129-4b79-956e-b18dc2cdebbb-kube-api-access-252xh\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980941 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.980994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zn7\" (UniqueName: \"kubernetes.io/projected/cc7162b6-24ec-432c-8317-6eff35ee7f87-kube-api-access-w9zn7\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981018 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981044 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981106 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vc7z\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.981235 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:31.981658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"
Apr 22 18:36:31.981658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc7162b6-24ec-432c-8317-6eff35ee7f87-tmp-dir\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:31.981658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981341 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:36:31.981658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:31.981658 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:31.981396 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:32.081960 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.081920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a10495a7-5071-4ef9-bc74-dad139a44eb8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"
Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.081990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw972\" (UniqueName: \"kubernetes.io/projected/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-kube-api-access-cw972\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"
Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7162b6-24ec-432c-8317-6eff35ee7f87-config-volume\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082048 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-tmp\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") "
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082106 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stz88\" (UniqueName: \"kubernetes.io/projected/a10495a7-5071-4ef9-bc74-dad139a44eb8-kube-api-access-stz88\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082134 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.082169 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.082192 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082209 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/432cf752-8129-4b79-956e-b18dc2cdebbb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: 
\"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.082218 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-252xh\" (UniqueName: \"kubernetes.io/projected/432cf752-8129-4b79-956e-b18dc2cdebbb-kube-api-access-252xh\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.082297 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:32.582275236 +0000 UTC m=+34.648488284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082342 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9zn7\" (UniqueName: \"kubernetes.io/projected/cc7162b6-24ec-432c-8317-6eff35ee7f87-kube-api-access-w9zn7\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082395 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpbr7\" (UniqueName: \"kubernetes.io/projected/8e29b790-9fe8-4412-be58-f5c6ee203578-kube-api-access-rpbr7\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 
22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082455 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.082509 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.083157 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082517 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vc7z\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.083157 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-tmp\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.083157 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.083157 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.082681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc7162b6-24ec-432c-8317-6eff35ee7f87-config-volume\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.083157 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083064 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.083409 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.083409 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083232 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.083819 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083795 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/432cf752-8129-4b79-956e-b18dc2cdebbb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.083948 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.083878 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:32.083948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.083948 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083923 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc7162b6-24ec-432c-8317-6eff35ee7f87-tmp-dir\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.083948 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.083947 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:32.583927894 +0000 UTC m=+34.650140956 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found Apr 22 18:36:32.084156 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.083993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.084156 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.084021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.084156 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.084054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.084299 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.084152 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cc7162b6-24ec-432c-8317-6eff35ee7f87-tmp-dir\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.087373 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:32.087348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-ca\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.087486 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.087380 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.087546 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.087528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.088080 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.088058 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.089368 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.089311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-klusterlet-config\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.089489 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.089433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.089956 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.089908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/432cf752-8129-4b79-956e-b18dc2cdebbb-hub\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.092159 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.092134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw972\" (UniqueName: \"kubernetes.io/projected/ff344b43-41d5-4c2c-959c-55b5f3eb6e0d-kube-api-access-cw972\") pod \"klusterlet-addon-workmgr-84c68797fc-nq28f\" (UID: \"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.093025 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.093002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vc7z\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " 
pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.093474 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.093434 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-252xh\" (UniqueName: \"kubernetes.io/projected/432cf752-8129-4b79-956e-b18dc2cdebbb-kube-api-access-252xh\") pod \"cluster-proxy-proxy-agent-844dbf5558-hhvkc\" (UID: \"432cf752-8129-4b79-956e-b18dc2cdebbb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.093580 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.093559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.093644 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.093615 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.093644 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.093633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9zn7\" (UniqueName: \"kubernetes.io/projected/cc7162b6-24ec-432c-8317-6eff35ee7f87-kube-api-access-w9zn7\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.098404 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.098382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/a10495a7-5071-4ef9-bc74-dad139a44eb8-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" Apr 22 18:36:32.098510 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.098391 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stz88\" (UniqueName: \"kubernetes.io/projected/a10495a7-5071-4ef9-bc74-dad139a44eb8-kube-api-access-stz88\") pod \"managed-serviceaccount-addon-agent-6b4b69d795-pw6qs\" (UID: \"a10495a7-5071-4ef9-bc74-dad139a44eb8\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" Apr 22 18:36:32.142720 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.142686 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" Apr 22 18:36:32.159746 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.159715 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" Apr 22 18:36:32.184689 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.184647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:32.184850 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.184773 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.184850 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.184800 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:32.184850 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.184825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpbr7\" (UniqueName: \"kubernetes.io/projected/8e29b790-9fe8-4412-be58-f5c6ee203578-kube-api-access-rpbr7\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.185036 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.184904 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:04.184884276 +0000 UTC m=+66.251097336 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:32.185036 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.184948 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:32.185036 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.185019 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:32.685003934 +0000 UTC m=+34.751216988 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found Apr 22 18:36:32.193925 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.193902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpbr7\" (UniqueName: \"kubernetes.io/projected/8e29b790-9fe8-4412-be58-f5c6ee203578-kube-api-access-rpbr7\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.195627 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.195602 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:32.285947 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.285848 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:32.286132 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.286049 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:32.286132 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.286076 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:32.286132 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.286090 2571 projected.go:194] Error preparing data for projected volume kube-api-access-9x524 for pod openshift-network-diagnostics/network-check-target-sd9g2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:32.286270 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.286169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524 podName:743f27e4-2f91-43c9-a360-363424f5563c nodeName:}" failed. No retries permitted until 2026-04-22 18:37:04.286148883 +0000 UTC m=+66.352361933 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9x524" (UniqueName: "kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524") pod "network-check-target-sd9g2" (UID: "743f27e4-2f91-43c9-a360-363424f5563c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:32.588772 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.588680 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.588811 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.588824 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.588844 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.588937 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:33.588913862 +0000 UTC m=+35.655126914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.588971 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:32.589363 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.589031 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:33.589015443 +0000 UTC m=+35.655228494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found Apr 22 18:36:32.690124 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:32.690087 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:32.690336 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.690258 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:32.690393 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:32.690337 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:33.690317089 +0000 UTC m=+35.756530145 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found Apr 22 18:36:33.508906 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.508877 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2" Apr 22 18:36:33.509077 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.508975 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:33.509144 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.509110 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:36:33.512771 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:36:33.512771 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512760 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:36:33.512970 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512786 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:36:33.512970 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512810 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\"" Apr 22 18:36:33.512970 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512786 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:36:33.512970 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.512745 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vgxpp\"" Apr 22 18:36:33.597448 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.597413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:33.597859 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.597524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:33.597859 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.597688 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:33.597859 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.597755 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:35.597735515 +0000 UTC m=+37.663948564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found Apr 22 18:36:33.598316 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.598178 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:33.598316 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.598200 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:33.598316 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.598290 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:35.598271647 +0000 UTC m=+37.664484701 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found Apr 22 18:36:33.690636 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.690609 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f"] Apr 22 18:36:33.698991 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.698961 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:33.699112 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.699073 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:33.699169 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:33.699142 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:35.699124096 +0000 UTC m=+37.765337151 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found Apr 22 18:36:33.700671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.700648 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"] Apr 22 18:36:33.704331 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:33.704312 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs"] Apr 22 18:36:33.779101 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:33.778967 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff344b43_41d5_4c2c_959c_55b5f3eb6e0d.slice/crio-ac9039c3f8ce0eb36dde1253efb49e3af3f032d6ea6965ede7920649eae4ba1d WatchSource:0}: Error finding container ac9039c3f8ce0eb36dde1253efb49e3af3f032d6ea6965ede7920649eae4ba1d: Status 404 returned error can't find the container with id ac9039c3f8ce0eb36dde1253efb49e3af3f032d6ea6965ede7920649eae4ba1d Apr 22 18:36:33.779702 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:33.779654 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod432cf752_8129_4b79_956e_b18dc2cdebbb.slice/crio-1d16eac9c5e48805139aff3ec2ef2ffd7ff3f373fcd5946887f006f0836c94ad WatchSource:0}: Error finding container 1d16eac9c5e48805139aff3ec2ef2ffd7ff3f373fcd5946887f006f0836c94ad: Status 404 returned error can't find the container with id 1d16eac9c5e48805139aff3ec2ef2ffd7ff3f373fcd5946887f006f0836c94ad Apr 22 18:36:33.780588 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:33.780477 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10495a7_5071_4ef9_bc74_dad139a44eb8.slice/crio-94b8bd230650313f04e39d8182f97a35c5d344b65f7732cfb7f040073fe35ef1 WatchSource:0}: Error finding container 94b8bd230650313f04e39d8182f97a35c5d344b65f7732cfb7f040073fe35ef1: Status 404 returned error can't find the container with id 94b8bd230650313f04e39d8182f97a35c5d344b65f7732cfb7f040073fe35ef1 Apr 22 18:36:34.304096 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.304003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:34.307498 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.307473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6adad479-9e37-4d83-b878-e22fdf6dc3d9-original-pull-secret\") pod \"global-pull-secret-syncer-hlw26\" (UID: \"6adad479-9e37-4d83-b878-e22fdf6dc3d9\") " pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:34.426346 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.426306 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hlw26" Apr 22 18:36:34.547304 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.547275 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hlw26"] Apr 22 18:36:34.550966 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:36:34.550935 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6adad479_9e37_4d83_b878_e22fdf6dc3d9.slice/crio-fac20c2fc783b88fd0fb80bdc7b18ec41f007bb6b5b5b77adde7b1206f4b85fe WatchSource:0}: Error finding container fac20c2fc783b88fd0fb80bdc7b18ec41f007bb6b5b5b77adde7b1206f4b85fe: Status 404 returned error can't find the container with id fac20c2fc783b88fd0fb80bdc7b18ec41f007bb6b5b5b77adde7b1206f4b85fe Apr 22 18:36:34.680201 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.680166 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="e3c4ed4b5e4766d5e65bf7cbc479e71752123d76063311c61d516adab58a5de1" exitCode=0 Apr 22 18:36:34.680633 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.680245 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"e3c4ed4b5e4766d5e65bf7cbc479e71752123d76063311c61d516adab58a5de1"} Apr 22 18:36:34.681220 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.681196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hlw26" event={"ID":"6adad479-9e37-4d83-b878-e22fdf6dc3d9","Type":"ContainerStarted","Data":"fac20c2fc783b88fd0fb80bdc7b18ec41f007bb6b5b5b77adde7b1206f4b85fe"} Apr 22 18:36:34.682207 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.682177 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" event={"ID":"a10495a7-5071-4ef9-bc74-dad139a44eb8","Type":"ContainerStarted","Data":"94b8bd230650313f04e39d8182f97a35c5d344b65f7732cfb7f040073fe35ef1"} Apr 22 18:36:34.683147 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.683124 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerStarted","Data":"1d16eac9c5e48805139aff3ec2ef2ffd7ff3f373fcd5946887f006f0836c94ad"} Apr 22 18:36:34.684078 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:34.684060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" event={"ID":"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d","Type":"ContainerStarted","Data":"ac9039c3f8ce0eb36dde1253efb49e3af3f032d6ea6965ede7920649eae4ba1d"} Apr 22 18:36:35.616459 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:35.616418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:35.616644 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:35.616494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:35.616721 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.616675 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" 
not found Apr 22 18:36:35.616721 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.616691 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:35.616831 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.616752 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:39.616733814 +0000 UTC m=+41.682946865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found Apr 22 18:36:35.616987 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.616970 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:35.617053 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.617018 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:39.617004059 +0000 UTC m=+41.683217112 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found Apr 22 18:36:35.698163 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:35.698068 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1680f68-7360-441e-8836-c911ac062e82" containerID="37543a9c6c6c851213f33014a8850d080e59a86417a5aed30a58149469d1a77a" exitCode=0 Apr 22 18:36:35.698163 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:35.698146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerDied","Data":"37543a9c6c6c851213f33014a8850d080e59a86417a5aed30a58149469d1a77a"} Apr 22 18:36:35.716984 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:35.716949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:35.717149 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.717096 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:35.717210 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:35.717152 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:39.717134321 +0000 UTC m=+41.783347372 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found Apr 22 18:36:36.705183 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:36.705143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" event={"ID":"b1680f68-7360-441e-8836-c911ac062e82","Type":"ContainerStarted","Data":"f4aa222f80e9861d584995d4754d5857b734a6fec0343d9a774749ea57552bff"} Apr 22 18:36:36.731183 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:36.730960 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d5f7s" podStartSLOduration=6.221423461 podStartE2EDuration="38.730940453s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:36:01.301414037 +0000 UTC m=+3.367627083" lastFinishedPulling="2026-04-22 18:36:33.810931024 +0000 UTC m=+35.877144075" observedRunningTime="2026-04-22 18:36:36.729583999 +0000 UTC m=+38.795797069" watchObservedRunningTime="2026-04-22 18:36:36.730940453 +0000 UTC m=+38.797153522" Apr 22 18:36:39.652481 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:39.652441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:39.652495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " 
pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.652606 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.652607 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.652687 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.652671927 +0000 UTC m=+49.718884979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.652617 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:39.653127 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.652746 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.652729794 +0000 UTC m=+49.718942853 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found Apr 22 18:36:39.752998 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:39.752953 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6" Apr 22 18:36:39.753172 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.753092 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:39.753219 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:39.753182 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.753166513 +0000 UTC m=+49.819379560 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found Apr 22 18:36:42.718736 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.718643 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hlw26" event={"ID":"6adad479-9e37-4d83-b878-e22fdf6dc3d9","Type":"ContainerStarted","Data":"dc0a263dd921117ccaafa03199dbc32abee0b94361b85a98297f4431b6ea7129"} Apr 22 18:36:42.720047 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.720014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" event={"ID":"a10495a7-5071-4ef9-bc74-dad139a44eb8","Type":"ContainerStarted","Data":"b9f70acca840a7173c2764791b8040b6589a993fad8a1fda2fa39df1201127ea"} Apr 22 18:36:42.721285 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.721260 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerStarted","Data":"43288cb6332b17833ea4193223bb05eb34abcd36309b7479981417dfcd8c7938"} Apr 22 18:36:42.722411 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.722391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" event={"ID":"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d","Type":"ContainerStarted","Data":"d2dafc89c92797f5a696eab2aa43e42425e40ba7b2c8905513bb9b2a99037b05"} Apr 22 18:36:42.722621 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.722602 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:42.724299 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:36:42.724275 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:36:42.735117 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.735071 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hlw26" podStartSLOduration=32.983149735 podStartE2EDuration="40.735058785s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:34.553021856 +0000 UTC m=+36.619234909" lastFinishedPulling="2026-04-22 18:36:42.304930909 +0000 UTC m=+44.371143959" observedRunningTime="2026-04-22 18:36:42.734074783 +0000 UTC m=+44.800287849" watchObservedRunningTime="2026-04-22 18:36:42.735058785 +0000 UTC m=+44.801271853" Apr 22 18:36:42.750029 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.749986 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" podStartSLOduration=33.430854726 podStartE2EDuration="41.749972572s" podCreationTimestamp="2026-04-22 18:36:01 +0000 UTC" firstStartedPulling="2026-04-22 18:36:33.788482032 +0000 UTC m=+35.854695082" lastFinishedPulling="2026-04-22 18:36:42.107599882 +0000 UTC m=+44.173812928" observedRunningTime="2026-04-22 18:36:42.749265488 +0000 UTC m=+44.815478557" watchObservedRunningTime="2026-04-22 18:36:42.749972572 +0000 UTC m=+44.816185641" Apr 22 18:36:42.773418 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:42.773374 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" podStartSLOduration=33.438172449 podStartE2EDuration="41.773358801s" podCreationTimestamp="2026-04-22 18:36:01 +0000 UTC" firstStartedPulling="2026-04-22 18:36:33.788135259 +0000 UTC m=+35.854348315" 
lastFinishedPulling="2026-04-22 18:36:42.123321605 +0000 UTC m=+44.189534667" observedRunningTime="2026-04-22 18:36:42.77276725 +0000 UTC m=+44.838980319" watchObservedRunningTime="2026-04-22 18:36:42.773358801 +0000 UTC m=+44.839571870" Apr 22 18:36:45.731856 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:45.731822 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerStarted","Data":"0628c8eca10b38a9d8fc0b56c2247aa7dc97fec3b60ee8d531b251940740573a"} Apr 22 18:36:45.731856 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:45.731863 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerStarted","Data":"8cb2cffff7f806ba1776fe9ba058fbe3737ee19f922a8ccd853976cc0d825a04"} Apr 22 18:36:45.752303 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:45.752253 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" podStartSLOduration=33.541318131 podStartE2EDuration="44.752236296s" podCreationTimestamp="2026-04-22 18:36:01 +0000 UTC" firstStartedPulling="2026-04-22 18:36:33.788421326 +0000 UTC m=+35.854634382" lastFinishedPulling="2026-04-22 18:36:44.999339494 +0000 UTC m=+47.065552547" observedRunningTime="2026-04-22 18:36:45.750338417 +0000 UTC m=+47.816551486" watchObservedRunningTime="2026-04-22 18:36:45.752236296 +0000 UTC m=+47.818449364" Apr 22 18:36:47.704361 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:47.704315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: 
\"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj" Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:47.704372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.704482 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.704494 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.704542 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:03.704527692 +0000 UTC m=+65.770740739 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found
Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.704482 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:36:47.704783 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.704604 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:03.704591816 +0000 UTC m=+65.770804868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found
Apr 22 18:36:47.804982 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:47.804945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:36:47.805148 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.805087 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:36:47.805195 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:36:47.805151 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:03.805135915 +0000 UTC m=+65.871348968 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found
Apr 22 18:36:57.672616 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:36:57.672583 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ncvvr"
Apr 22 18:37:03.726192 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:03.726152 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:03.726259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.726303 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.726321 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.726386 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.726388 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.726371677 +0000 UTC m=+97.792584728 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found
Apr 22 18:37:03.726732 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.726454 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.726433465 +0000 UTC m=+97.792646516 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found
Apr 22 18:37:03.827583 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:03.827548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:37:03.827790 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.827678 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:37:03.827790 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:03.827746 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:35.8277272 +0000 UTC m=+97.893940246 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found
Apr 22 18:37:04.231652 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.231617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:37:04.234431 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.234411 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:37:04.242353 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:04.242332 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:37:04.242437 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:04.242392 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.242376309 +0000 UTC m=+130.308589362 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : secret "metrics-daemon-secret" not found
Apr 22 18:37:04.332946 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.332910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:37:04.335944 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.335921 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:37:04.346452 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.346429 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:37:04.357707 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.357675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x524\" (UniqueName: \"kubernetes.io/projected/743f27e4-2f91-43c9-a360-363424f5563c-kube-api-access-9x524\") pod \"network-check-target-sd9g2\" (UID: \"743f27e4-2f91-43c9-a360-363424f5563c\") " pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:37:04.421625 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.421592 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vgxpp\""
Apr 22 18:37:04.429770 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.429747 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:37:04.544822 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.544791 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sd9g2"]
Apr 22 18:37:04.548045 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:37:04.548007 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743f27e4_2f91_43c9_a360_363424f5563c.slice/crio-8aba00cf8627a374550515782c732e36e076bc8498a0cb72bd1e6ad22628eaf2 WatchSource:0}: Error finding container 8aba00cf8627a374550515782c732e36e076bc8498a0cb72bd1e6ad22628eaf2: Status 404 returned error can't find the container with id 8aba00cf8627a374550515782c732e36e076bc8498a0cb72bd1e6ad22628eaf2
Apr 22 18:37:04.783242 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:04.783157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sd9g2" event={"ID":"743f27e4-2f91-43c9-a360-363424f5563c","Type":"ContainerStarted","Data":"8aba00cf8627a374550515782c732e36e076bc8498a0cb72bd1e6ad22628eaf2"}
Apr 22 18:37:07.791583 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:07.791539 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sd9g2" event={"ID":"743f27e4-2f91-43c9-a360-363424f5563c","Type":"ContainerStarted","Data":"d851543b7252bde8fdb15bdd91069d3180ec51fdf245c44591d1f4f730031e86"}
Apr 22 18:37:07.791978 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:07.791657 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:37:07.808539 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:07.808489 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sd9g2" podStartSLOduration=67.048763208 podStartE2EDuration="1m9.808476024s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:37:04.550010626 +0000 UTC m=+66.616223673" lastFinishedPulling="2026-04-22 18:37:07.309723441 +0000 UTC m=+69.375936489" observedRunningTime="2026-04-22 18:37:07.807958606 +0000 UTC m=+69.874171742" watchObservedRunningTime="2026-04-22 18:37:07.808476024 +0000 UTC m=+69.874689087"
Apr 22 18:37:35.775578 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:35.775483 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:37:35.775578 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:35.775537 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:37:35.776099 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.775627 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:37:35.776099 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.775704 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls podName:cc7162b6-24ec-432c-8317-6eff35ee7f87 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:39.775686761 +0000 UTC m=+161.841899813 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls") pod "dns-default-pfpxj" (UID: "cc7162b6-24ec-432c-8317-6eff35ee7f87") : secret "dns-default-metrics-tls" not found
Apr 22 18:37:35.776099 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.775633 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:37:35.776099 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.775727 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57bd58d489-zrjpv: secret "image-registry-tls" not found
Apr 22 18:37:35.776099 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.775776 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls podName:55f34127-b675-4740-a1a6-5c477c79f2f8 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:39.775764006 +0000 UTC m=+161.841977053 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls") pod "image-registry-57bd58d489-zrjpv" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8") : secret "image-registry-tls" not found
Apr 22 18:37:35.876243 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:35.876211 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:37:35.876382 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.876325 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:37:35.876382 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:37:35.876373 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert podName:8e29b790-9fe8-4412-be58-f5c6ee203578 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:39.876359087 +0000 UTC m=+161.942572133 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert") pod "ingress-canary-v26l6" (UID: "8e29b790-9fe8-4412-be58-f5c6ee203578") : secret "canary-serving-cert" not found
Apr 22 18:37:38.796821 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:37:38.796791 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sd9g2"
Apr 22 18:38:08.328407 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:08.328353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:38:08.328939 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:08.328503 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:38:08.328939 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:08.328579 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs podName:7ff729b2-3f8f-4665-ae12-c83c4b179998 nodeName:}" failed. No retries permitted until 2026-04-22 18:40:10.328563913 +0000 UTC m=+252.394776960 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs") pod "network-metrics-daemon-8llv8" (UID: "7ff729b2-3f8f-4665-ae12-c83c4b179998") : secret "metrics-daemon-secret" not found
Apr 22 18:38:12.367816 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:12.367789 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-47tpt_2bd7e4a3-0f74-4ca3-848f-f713afd48c22/dns-node-resolver/0.log"
Apr 22 18:38:13.367915 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:13.367885 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2df55_20357bd7-0dd6-4792-92d1-d5705073a86e/node-ca/0.log"
Apr 22 18:38:34.852458 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:34.852393 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8"
Apr 22 18:38:34.901968 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:34.901924 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pfpxj" podUID="cc7162b6-24ec-432c-8317-6eff35ee7f87"
Apr 22 18:38:34.917577 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:34.917554 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-v26l6" podUID="8e29b790-9fe8-4412-be58-f5c6ee203578"
Apr 22 18:38:34.996251 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:34.996228 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:38:34.996388 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:34.996342 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:38:34.996439 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:34.996350 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:38:35.381464 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.381430 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6lmd2"]
Apr 22 18:38:35.384570 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.384544 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.387320 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.387300 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b54tz\""
Apr 22 18:38:35.387522 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.387503 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:38:35.388511 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.388495 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:38:35.388794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.388777 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:38:35.388896 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.388864 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:38:35.393558 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.393533 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6lmd2"]
Apr 22 18:38:35.444521 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.444480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2226793d-aee6-48b4-898d-378b00cbc606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.444690 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.444552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27x4s\" (UniqueName: \"kubernetes.io/projected/2226793d-aee6-48b4-898d-378b00cbc606-kube-api-access-27x4s\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.444690 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.444600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2226793d-aee6-48b4-898d-378b00cbc606-data-volume\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.444690 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.444625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2226793d-aee6-48b4-898d-378b00cbc606-crio-socket\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.444794 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.444711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2226793d-aee6-48b4-898d-378b00cbc606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545448 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2226793d-aee6-48b4-898d-378b00cbc606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545642 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27x4s\" (UniqueName: \"kubernetes.io/projected/2226793d-aee6-48b4-898d-378b00cbc606-kube-api-access-27x4s\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545642 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545515 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2226793d-aee6-48b4-898d-378b00cbc606-data-volume\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545642 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545534 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2226793d-aee6-48b4-898d-378b00cbc606-crio-socket\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545642 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2226793d-aee6-48b4-898d-378b00cbc606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.545801 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2226793d-aee6-48b4-898d-378b00cbc606-crio-socket\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.546004 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.545973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2226793d-aee6-48b4-898d-378b00cbc606-data-volume\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.546221 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.546203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2226793d-aee6-48b4-898d-378b00cbc606-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.547833 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.547816 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2226793d-aee6-48b4-898d-378b00cbc606-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.557920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.557901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27x4s\" (UniqueName: \"kubernetes.io/projected/2226793d-aee6-48b4-898d-378b00cbc606-kube-api-access-27x4s\") pod \"insights-runtime-extractor-6lmd2\" (UID: \"2226793d-aee6-48b4-898d-378b00cbc606\") " pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.693972 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.693845 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6lmd2"
Apr 22 18:38:35.808217 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.808186 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6lmd2"]
Apr 22 18:38:35.811379 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:38:35.811348 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2226793d_aee6_48b4_898d_378b00cbc606.slice/crio-9debebe99d3564d5d9cfc8222b6dad2833f349678ad7a61cc401b250c28ef44f WatchSource:0}: Error finding container 9debebe99d3564d5d9cfc8222b6dad2833f349678ad7a61cc401b250c28ef44f: Status 404 returned error can't find the container with id 9debebe99d3564d5d9cfc8222b6dad2833f349678ad7a61cc401b250c28ef44f
Apr 22 18:38:36.000031 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.999938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6lmd2" event={"ID":"2226793d-aee6-48b4-898d-378b00cbc606","Type":"ContainerStarted","Data":"df2c562823fa3e3b9bde393350ae14079b416d8e2e629fcec7b8feb25ce63626"}
Apr 22 18:38:36.000031 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:35.999977 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6lmd2" event={"ID":"2226793d-aee6-48b4-898d-378b00cbc606","Type":"ContainerStarted","Data":"9debebe99d3564d5d9cfc8222b6dad2833f349678ad7a61cc401b250c28ef44f"}
Apr 22 18:38:36.530936 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:36.530901 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8llv8" podUID="7ff729b2-3f8f-4665-ae12-c83c4b179998"
Apr 22 18:38:37.003823 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:37.003791 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6lmd2" event={"ID":"2226793d-aee6-48b4-898d-378b00cbc606","Type":"ContainerStarted","Data":"2b5afaba98fe939e3df83039d74934139861ddbfb3567bc5813dbda571689552"}
Apr 22 18:38:39.012283 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.012238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6lmd2" event={"ID":"2226793d-aee6-48b4-898d-378b00cbc606","Type":"ContainerStarted","Data":"4ce432d240695d1f10479e6d3dafe8dd1cb9953fd9049cdaf096754283cc5265"}
Apr 22 18:38:39.037970 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.037920 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6lmd2" podStartSLOduration=1.690247141 podStartE2EDuration="4.037906346s" podCreationTimestamp="2026-04-22 18:38:35 +0000 UTC" firstStartedPulling="2026-04-22 18:38:35.86826984 +0000 UTC m=+157.934482887" lastFinishedPulling="2026-04-22 18:38:38.215929038 +0000 UTC m=+160.282142092" observedRunningTime="2026-04-22 18:38:39.036818523 +0000 UTC m=+161.103031592" watchObservedRunningTime="2026-04-22 18:38:39.037906346 +0000 UTC m=+161.104119414"
Apr 22 18:38:39.782991 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.782955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:38:39.783203 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.783120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:38:39.785383 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.785352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cc7162b6-24ec-432c-8317-6eff35ee7f87-metrics-tls\") pod \"dns-default-pfpxj\" (UID: \"cc7162b6-24ec-432c-8317-6eff35ee7f87\") " pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:38:39.785504 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.785408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"image-registry-57bd58d489-zrjpv\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:38:39.800109 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.800083 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\""
Apr 22 18:38:39.801241 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.801218 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vvpgg\""
Apr 22 18:38:39.807662 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.807643 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pfpxj"
Apr 22 18:38:39.807765 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.807711 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:38:39.883989 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.883959 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:38:39.886501 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.886471 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e29b790-9fe8-4412-be58-f5c6ee203578-cert\") pod \"ingress-canary-v26l6\" (UID: \"8e29b790-9fe8-4412-be58-f5c6ee203578\") " pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:38:39.939104 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.939072 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pfpxj"]
Apr 22 18:38:39.941758 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:38:39.941728 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc7162b6_24ec_432c_8317_6eff35ee7f87.slice/crio-06dfdec88652f34f2b553924ff2212e0a2a36e1b231b9f1a17e7fa7036bc01c1 WatchSource:0}: Error finding container 06dfdec88652f34f2b553924ff2212e0a2a36e1b231b9f1a17e7fa7036bc01c1: Status 404 returned error can't find the container with id 06dfdec88652f34f2b553924ff2212e0a2a36e1b231b9f1a17e7fa7036bc01c1
Apr 22 18:38:39.956151 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:39.956122 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"]
Apr 22 18:38:39.958885 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:38:39.958843 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f34127_b675_4740_a1a6_5c477c79f2f8.slice/crio-44ff1bec1ec6fb79c2c231c553741260c6988162401a5270a300610f2cb0cbf7 WatchSource:0}: Error finding container 44ff1bec1ec6fb79c2c231c553741260c6988162401a5270a300610f2cb0cbf7: Status 404 returned error can't find the container with id 44ff1bec1ec6fb79c2c231c553741260c6988162401a5270a300610f2cb0cbf7
Apr 22 18:38:40.016197 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:40.016158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfpxj" event={"ID":"cc7162b6-24ec-432c-8317-6eff35ee7f87","Type":"ContainerStarted","Data":"06dfdec88652f34f2b553924ff2212e0a2a36e1b231b9f1a17e7fa7036bc01c1"}
Apr 22 18:38:40.017197 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:40.017173 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" event={"ID":"55f34127-b675-4740-a1a6-5c477c79f2f8","Type":"ContainerStarted","Data":"44ff1bec1ec6fb79c2c231c553741260c6988162401a5270a300610f2cb0cbf7"}
Apr 22 18:38:40.100275 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:40.100248 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\""
Apr 22 18:38:40.107551 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:40.107530 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v26l6"
Apr 22 18:38:40.224311 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:40.224281 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v26l6"]
Apr 22 18:38:40.227857 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:38:40.227825 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e29b790_9fe8_4412_be58_f5c6ee203578.slice/crio-13600215412d7215d8a9db4e87edc0d7e4b1f88907de3e2bbe6b19f4ff0a40bc WatchSource:0}: Error finding container 13600215412d7215d8a9db4e87edc0d7e4b1f88907de3e2bbe6b19f4ff0a40bc: Status 404 returned error can't find the container with id 13600215412d7215d8a9db4e87edc0d7e4b1f88907de3e2bbe6b19f4ff0a40bc
Apr 22 18:38:41.021960 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:41.021823 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" event={"ID":"55f34127-b675-4740-a1a6-5c477c79f2f8","Type":"ContainerStarted","Data":"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"}
Apr 22 18:38:41.021960 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:41.021906 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:38:41.023037 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:41.023002 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v26l6" event={"ID":"8e29b790-9fe8-4412-be58-f5c6ee203578","Type":"ContainerStarted","Data":"13600215412d7215d8a9db4e87edc0d7e4b1f88907de3e2bbe6b19f4ff0a40bc"}
Apr 22 18:38:41.049951 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:41.049895 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" podStartSLOduration=163.049857066 podStartE2EDuration="2m43.049857066s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:41.048194641 +0000 UTC m=+163.114407711" watchObservedRunningTime="2026-04-22 18:38:41.049857066 +0000 UTC m=+163.116070139"
Apr 22 18:38:42.723350 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:42.723222 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" podUID="ff344b43-41d5-4c2c-959c-55b5f3eb6e0d" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused"
Apr 22 18:38:43.030174 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.030084 2571 generic.go:358] "Generic (PLEG): container finished" podID="a10495a7-5071-4ef9-bc74-dad139a44eb8" containerID="b9f70acca840a7173c2764791b8040b6589a993fad8a1fda2fa39df1201127ea" exitCode=255
Apr 22 18:38:43.030174 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.030160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" event={"ID":"a10495a7-5071-4ef9-bc74-dad139a44eb8","Type":"ContainerDied","Data":"b9f70acca840a7173c2764791b8040b6589a993fad8a1fda2fa39df1201127ea"}
Apr 22 18:38:43.030531 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.030513 2571 scope.go:117] "RemoveContainer" containerID="b9f70acca840a7173c2764791b8040b6589a993fad8a1fda2fa39df1201127ea"
Apr 22 18:38:43.031460 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.031437 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v26l6" event={"ID":"8e29b790-9fe8-4412-be58-f5c6ee203578","Type":"ContainerStarted","Data":"e8c15544ea2a737ace3e4d3e759838fcca4d324b85ad19d203fa4719b7ba00b6"}
Apr 22 18:38:43.032629
ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.032607 2571 generic.go:358] "Generic (PLEG): container finished" podID="ff344b43-41d5-4c2c-959c-55b5f3eb6e0d" containerID="d2dafc89c92797f5a696eab2aa43e42425e40ba7b2c8905513bb9b2a99037b05" exitCode=1 Apr 22 18:38:43.032714 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.032673 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" event={"ID":"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d","Type":"ContainerDied","Data":"d2dafc89c92797f5a696eab2aa43e42425e40ba7b2c8905513bb9b2a99037b05"} Apr 22 18:38:43.033035 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.033023 2571 scope.go:117] "RemoveContainer" containerID="d2dafc89c92797f5a696eab2aa43e42425e40ba7b2c8905513bb9b2a99037b05" Apr 22 18:38:43.034326 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.034308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfpxj" event={"ID":"cc7162b6-24ec-432c-8317-6eff35ee7f87","Type":"ContainerStarted","Data":"c8175dc55b0e18bd697dd8734204fc9ef1b879620f3773657e9dd94925761c16"} Apr 22 18:38:43.034386 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.034332 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pfpxj" event={"ID":"cc7162b6-24ec-432c-8317-6eff35ee7f87","Type":"ContainerStarted","Data":"754e7800cb01c8add19222b6c8fa7eab373d7b956322916c33a4b5513d2d54fe"} Apr 22 18:38:43.034469 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.034453 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pfpxj" Apr 22 18:38:43.064920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.064855 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pfpxj" podStartSLOduration=129.900371487 podStartE2EDuration="2m12.064842373s" podCreationTimestamp="2026-04-22 18:36:31 +0000 UTC" 
firstStartedPulling="2026-04-22 18:38:39.943601535 +0000 UTC m=+162.009814583" lastFinishedPulling="2026-04-22 18:38:42.108072407 +0000 UTC m=+164.174285469" observedRunningTime="2026-04-22 18:38:43.064192859 +0000 UTC m=+165.130405928" watchObservedRunningTime="2026-04-22 18:38:43.064842373 +0000 UTC m=+165.131055442" Apr 22 18:38:43.080594 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:43.080550 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v26l6" podStartSLOduration=130.200114379 podStartE2EDuration="2m12.080536905s" podCreationTimestamp="2026-04-22 18:36:31 +0000 UTC" firstStartedPulling="2026-04-22 18:38:40.229881654 +0000 UTC m=+162.296094716" lastFinishedPulling="2026-04-22 18:38:42.110304194 +0000 UTC m=+164.176517242" observedRunningTime="2026-04-22 18:38:43.079667717 +0000 UTC m=+165.145880790" watchObservedRunningTime="2026-04-22 18:38:43.080536905 +0000 UTC m=+165.146749971" Apr 22 18:38:44.038537 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:44.038500 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b4b69d795-pw6qs" event={"ID":"a10495a7-5071-4ef9-bc74-dad139a44eb8","Type":"ContainerStarted","Data":"70a5cb41ecc46d10c5560a640c1cd987722edbb07f584e7de874c32ad3ad4692"} Apr 22 18:38:44.040018 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:44.039993 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" event={"ID":"ff344b43-41d5-4c2c-959c-55b5f3eb6e0d","Type":"ContainerStarted","Data":"12158d8468194c7790e7567625a226e500e2cead59403fc055ee5e4d63b6fc2d"} Apr 22 18:38:44.040469 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:44.040453 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:38:44.041060 
ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:44.041043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84c68797fc-nq28f" Apr 22 18:38:47.255660 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.255581 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hn449"] Apr 22 18:38:47.258575 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.258558 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.262391 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.262357 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:38:47.262547 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.262518 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:38:47.262795 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.262779 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:38:47.262795 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.262787 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:38:47.262978 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.262787 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:38:47.263755 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.263734 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:38:47.264372 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.264356 
2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bc9vl\"" Apr 22 18:38:47.342852 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.342817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-root\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.342852 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.342856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343118 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.342897 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwwk\" (UniqueName: \"kubernetes.io/projected/08418377-4601-42e1-a730-06d7b3fedf8c-kube-api-access-zpwwk\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343118 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.342972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343118 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.343016 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-wtmp\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343118 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.343047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-metrics-client-ca\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343118 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.343093 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343325 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.343147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-textfile\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.343325 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.343184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-sys\") pod \"node-exporter-hn449\" (UID: 
\"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443699 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443664 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-root\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443699 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwwk\" (UniqueName: \"kubernetes.io/projected/08418377-4601-42e1-a730-06d7b3fedf8c-kube-api-access-zpwwk\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-wtmp\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-root\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:47.443882 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-metrics-client-ca\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.443939 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-wtmp\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.443992 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:38:47.443952 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls podName:08418377-4601-42e1-a730-06d7b3fedf8c nodeName:}" failed. No retries permitted until 2026-04-22 18:38:47.943935006 +0000 UTC m=+170.010148052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls") pod "node-exporter-hn449" (UID: "08418377-4601-42e1-a730-06d7b3fedf8c") : secret "node-exporter-tls" not found Apr 22 18:38:47.444329 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444009 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444329 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444060 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-textfile\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444329 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-sys\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444329 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08418377-4601-42e1-a730-06d7b3fedf8c-sys\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444464 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444416 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-textfile\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444464 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444440 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-metrics-client-ca\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.444528 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.444486 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.446072 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.446053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.455206 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.455184 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwwk\" (UniqueName: \"kubernetes.io/projected/08418377-4601-42e1-a730-06d7b3fedf8c-kube-api-access-zpwwk\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " 
pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.948812 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.948772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:47.951033 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:47.951000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/08418377-4601-42e1-a730-06d7b3fedf8c-node-exporter-tls\") pod \"node-exporter-hn449\" (UID: \"08418377-4601-42e1-a730-06d7b3fedf8c\") " pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:48.168529 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:48.168488 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hn449" Apr 22 18:38:48.178847 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:38:48.178817 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08418377_4601_42e1_a730_06d7b3fedf8c.slice/crio-da5059f369594536865fc6be1f1351e30b183df4e17bd9d3847f9ea23f90016c WatchSource:0}: Error finding container da5059f369594536865fc6be1f1351e30b183df4e17bd9d3847f9ea23f90016c: Status 404 returned error can't find the container with id da5059f369594536865fc6be1f1351e30b183df4e17bd9d3847f9ea23f90016c Apr 22 18:38:49.054320 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:49.054284 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hn449" event={"ID":"08418377-4601-42e1-a730-06d7b3fedf8c","Type":"ContainerStarted","Data":"da5059f369594536865fc6be1f1351e30b183df4e17bd9d3847f9ea23f90016c"} Apr 22 18:38:49.508955 ip-10-0-134-126 
kubenswrapper[2571]: I0422 18:38:49.508915 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8" Apr 22 18:38:50.058131 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:50.058090 2571 generic.go:358] "Generic (PLEG): container finished" podID="08418377-4601-42e1-a730-06d7b3fedf8c" containerID="1d1c78c03a335b7c7fb5b83d6791cd054d5949a2b1adc911707383500348ae11" exitCode=0 Apr 22 18:38:50.058613 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:50.058158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hn449" event={"ID":"08418377-4601-42e1-a730-06d7b3fedf8c","Type":"ContainerDied","Data":"1d1c78c03a335b7c7fb5b83d6791cd054d5949a2b1adc911707383500348ae11"} Apr 22 18:38:51.061920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:51.061862 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hn449" event={"ID":"08418377-4601-42e1-a730-06d7b3fedf8c","Type":"ContainerStarted","Data":"e26df3c11c42a8b67e9a69bf742fd47499a47138832e40eb6be9c6aecd061c3b"} Apr 22 18:38:51.061920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:51.061923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hn449" event={"ID":"08418377-4601-42e1-a730-06d7b3fedf8c","Type":"ContainerStarted","Data":"dede4e6764c512d78c7027aef441badb1f043bc5e4f5f09ec8328227019298ad"} Apr 22 18:38:51.086289 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:51.086239 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hn449" podStartSLOduration=3.129276079 podStartE2EDuration="4.086222845s" podCreationTimestamp="2026-04-22 18:38:47 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.180813532 +0000 UTC m=+170.247026578" lastFinishedPulling="2026-04-22 18:38:49.137760295 +0000 UTC m=+171.203973344" observedRunningTime="2026-04-22 18:38:51.084272843 +0000 UTC 
m=+173.150485911" watchObservedRunningTime="2026-04-22 18:38:51.086222845 +0000 UTC m=+173.152435914" Apr 22 18:38:53.042309 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:53.042276 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pfpxj" Apr 22 18:38:57.279437 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:57.279398 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"] Apr 22 18:38:57.283357 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:57.283329 2571 patch_prober.go:28] interesting pod/image-registry-57bd58d489-zrjpv container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:38:57.283490 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:38:57.283379 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:39:07.284419 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:07.284388 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:39:22.298444 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.298369 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerName="registry" containerID="cri-o://0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37" gracePeriod=30 Apr 22 18:39:22.543191 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.543166 2571 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" Apr 22 18:39:22.615567 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615531 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615573 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615605 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615624 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") " Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615650 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: 
\"55f34127-b675-4740-a1a6-5c477c79f2f8\") "
Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615683 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") "
Apr 22 18:39:22.615758 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615718 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") "
Apr 22 18:39:22.616103 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.615765 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vc7z\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z\") pod \"55f34127-b675-4740-a1a6-5c477c79f2f8\" (UID: \"55f34127-b675-4740-a1a6-5c477c79f2f8\") "
Apr 22 18:39:22.616682 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.616624 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:22.617060 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.617020 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:39:22.618671 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.618645 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:39:22.618776 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.618677 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:39:22.618776 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.618744 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z" (OuterVolumeSpecName: "kube-api-access-2vc7z") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "kube-api-access-2vc7z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:39:22.618776 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.618753 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:39:22.618776 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.618766 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:39:22.624694 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.624669 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "55f34127-b675-4740-a1a6-5c477c79f2f8" (UID: "55f34127-b675-4740-a1a6-5c477c79f2f8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:39:22.716670 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716633 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-bound-sa-token\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716670 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716663 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-certificates\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716670 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716675 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-registry-tls\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716942 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716686 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-image-registry-private-configuration\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716942 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716696 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55f34127-b675-4740-a1a6-5c477c79f2f8-trusted-ca\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716942 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716704 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vc7z\" (UniqueName: \"kubernetes.io/projected/55f34127-b675-4740-a1a6-5c477c79f2f8-kube-api-access-2vc7z\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716942 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716713 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/55f34127-b675-4740-a1a6-5c477c79f2f8-ca-trust-extracted\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:22.716942 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:22.716722 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/55f34127-b675-4740-a1a6-5c477c79f2f8-installation-pull-secrets\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\""
Apr 22 18:39:23.147513 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.147473 2571 generic.go:358] "Generic (PLEG): container finished" podID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerID="0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37" exitCode=0
Apr 22 18:39:23.147688 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.147551 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv"
Apr 22 18:39:23.147688 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.147549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" event={"ID":"55f34127-b675-4740-a1a6-5c477c79f2f8","Type":"ContainerDied","Data":"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"}
Apr 22 18:39:23.147688 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.147655 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57bd58d489-zrjpv" event={"ID":"55f34127-b675-4740-a1a6-5c477c79f2f8","Type":"ContainerDied","Data":"44ff1bec1ec6fb79c2c231c553741260c6988162401a5270a300610f2cb0cbf7"}
Apr 22 18:39:23.147688 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.147675 2571 scope.go:117] "RemoveContainer" containerID="0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"
Apr 22 18:39:23.155696 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.155671 2571 scope.go:117] "RemoveContainer" containerID="0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"
Apr 22 18:39:23.156037 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:39:23.156017 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37\": container with ID starting with 0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37 not found: ID does not exist" containerID="0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"
Apr 22 18:39:23.156110 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.156045 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37"} err="failed to get container status \"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37\": rpc error: code = NotFound desc = could not find container \"0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37\": container with ID starting with 0e47823a0678b81dc5b883d1a7f1fea8a1d722e5a6c8d948d4684907495ffd37 not found: ID does not exist"
Apr 22 18:39:23.171310 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.171280 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"]
Apr 22 18:39:23.176406 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:23.176380 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57bd58d489-zrjpv"]
Apr 22 18:39:24.512326 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:24.512286 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" path="/var/lib/kubelet/pods/55f34127-b675-4740-a1a6-5c477c79f2f8/volumes"
Apr 22 18:39:30.040315 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:30.040285 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/init-textfile/0.log"
Apr 22 18:39:30.240548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:30.240519 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/node-exporter/0.log"
Apr 22 18:39:30.440449 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:30.440421 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/kube-rbac-proxy/0.log"
Apr 22 18:39:37.440302 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:37.440267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v26l6_8e29b790-9fe8-4412-be58-f5c6ee203578/serve-healthcheck-canary/0.log"
Apr 22 18:39:42.161326 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:42.161287 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" podUID="432cf752-8129-4b79-956e-b18dc2cdebbb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:39:52.161156 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:39:52.161113 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" podUID="432cf752-8129-4b79-956e-b18dc2cdebbb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:40:02.161153 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:02.161108 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" podUID="432cf752-8129-4b79-956e-b18dc2cdebbb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 18:40:02.161649 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:02.161194 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc"
Apr 22 18:40:02.161806 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:02.161783 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"0628c8eca10b38a9d8fc0b56c2247aa7dc97fec3b60ee8d531b251940740573a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 22 18:40:02.161892 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:02.161833 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" podUID="432cf752-8129-4b79-956e-b18dc2cdebbb" containerName="service-proxy" containerID="cri-o://0628c8eca10b38a9d8fc0b56c2247aa7dc97fec3b60ee8d531b251940740573a" gracePeriod=30
Apr 22 18:40:03.258103 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:03.258070 2571 generic.go:358] "Generic (PLEG): container finished" podID="432cf752-8129-4b79-956e-b18dc2cdebbb" containerID="0628c8eca10b38a9d8fc0b56c2247aa7dc97fec3b60ee8d531b251940740573a" exitCode=2
Apr 22 18:40:03.258473 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:03.258122 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerDied","Data":"0628c8eca10b38a9d8fc0b56c2247aa7dc97fec3b60ee8d531b251940740573a"}
Apr 22 18:40:03.258473 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:03.258148 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-844dbf5558-hhvkc" event={"ID":"432cf752-8129-4b79-956e-b18dc2cdebbb","Type":"ContainerStarted","Data":"08856cc82887b000f0d1b5fa5c9322307df3e6cc7e1ce4bb1fc36fc9f7509618"}
Apr 22 18:40:10.359908 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:10.359830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:40:10.362224 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:10.362203 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7ff729b2-3f8f-4665-ae12-c83c4b179998-metrics-certs\") pod \"network-metrics-daemon-8llv8\" (UID: \"7ff729b2-3f8f-4665-ae12-c83c4b179998\") " pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:40:10.512530 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:10.512499 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\""
Apr 22 18:40:10.520110 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:10.520081 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8llv8"
Apr 22 18:40:10.638418 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:10.638331 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8llv8"]
Apr 22 18:40:10.641478 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:40:10.641445 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff729b2_3f8f_4665_ae12_c83c4b179998.slice/crio-fd5e30d25c981ae336e8442724bd907c7f6d3f60437195d3d149b822d182ac41 WatchSource:0}: Error finding container fd5e30d25c981ae336e8442724bd907c7f6d3f60437195d3d149b822d182ac41: Status 404 returned error can't find the container with id fd5e30d25c981ae336e8442724bd907c7f6d3f60437195d3d149b822d182ac41
Apr 22 18:40:11.280748 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:11.280707 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8llv8" event={"ID":"7ff729b2-3f8f-4665-ae12-c83c4b179998","Type":"ContainerStarted","Data":"fd5e30d25c981ae336e8442724bd907c7f6d3f60437195d3d149b822d182ac41"}
Apr 22 18:40:12.285434 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:12.285399 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8llv8" event={"ID":"7ff729b2-3f8f-4665-ae12-c83c4b179998","Type":"ContainerStarted","Data":"164927fecc94e08f7a6009f52ad7ab70e60ec70ce9d06d7b6bb34dadc8107f5c"}
Apr 22 18:40:12.285434 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:12.285434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8llv8" event={"ID":"7ff729b2-3f8f-4665-ae12-c83c4b179998","Type":"ContainerStarted","Data":"53c378f76dd9241bddf0880b706325d22d0240672dbfc2bd060c87911e225598"}
Apr 22 18:40:12.304210 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:12.304162 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8llv8" podStartSLOduration=253.346949896 podStartE2EDuration="4m14.304146505s" podCreationTimestamp="2026-04-22 18:35:58 +0000 UTC" firstStartedPulling="2026-04-22 18:40:10.643292951 +0000 UTC m=+252.709505999" lastFinishedPulling="2026-04-22 18:40:11.600489561 +0000 UTC m=+253.666702608" observedRunningTime="2026-04-22 18:40:12.303531072 +0000 UTC m=+254.369744138" watchObservedRunningTime="2026-04-22 18:40:12.304146505 +0000 UTC m=+254.370359585"
Apr 22 18:40:58.426539 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:40:58.426509 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 18:42:59.108083 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.108050 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"]
Apr 22 18:42:59.108581 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.108323 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerName="registry"
Apr 22 18:42:59.108581 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.108335 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerName="registry"
Apr 22 18:42:59.108581 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.108380 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="55f34127-b675-4740-a1a6-5c477c79f2f8" containerName="registry"
Apr 22 18:42:59.109994 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.109976 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.116214 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.116118 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 18:42:59.116287 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.116211 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 18:42:59.116287 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.116236 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 18:42:59.116287 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.116245 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 22 18:42:59.116441 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.116364 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 18:42:59.117224 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.117206 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-kvsvw\""
Apr 22 18:42:59.125103 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.125084 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"]
Apr 22 18:42:59.172733 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.172704 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.172897 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.172740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.172897 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.172768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss56\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-kube-api-access-fss56\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.273785 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.273744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fss56\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-kube-api-access-fss56\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.273955 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.273847 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.273955 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.273898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.274039 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.273995 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:42:59.274039 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.274009 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:42:59.274039 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.274030 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq: references non-existent secret key: tls.crt
Apr 22 18:42:59.274143 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.274100 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates podName:3ac8ca82-4f21-44a1-876f-d7da5c42b0c0 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:59.774081509 +0000 UTC m=+421.840294570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates") pod "keda-metrics-apiserver-7c9f485588-dg4gq" (UID: "3ac8ca82-4f21-44a1-876f-d7da5c42b0c0") : references non-existent secret key: tls.crt
Apr 22 18:42:59.274211 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.274181 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.283008 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.282989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss56\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-kube-api-access-fss56\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.778505 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:42:59.778474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:42:59.778670 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.778581 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:42:59.778670 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.778594 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:42:59.778670 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.778612 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq: references non-existent secret key: tls.crt
Apr 22 18:42:59.778670 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:42:59.778665 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates podName:3ac8ca82-4f21-44a1-876f-d7da5c42b0c0 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:00.778651943 +0000 UTC m=+422.844864994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates") pod "keda-metrics-apiserver-7c9f485588-dg4gq" (UID: "3ac8ca82-4f21-44a1-876f-d7da5c42b0c0") : references non-existent secret key: tls.crt
Apr 22 18:43:00.785621 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:00.785586 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:00.785994 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:00.785697 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:43:00.785994 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:00.785708 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:43:00.785994 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:00.785726 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq: references non-existent secret key: tls.crt
Apr 22 18:43:00.785994 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:00.785783 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates podName:3ac8ca82-4f21-44a1-876f-d7da5c42b0c0 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:02.785768178 +0000 UTC m=+424.851981225 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates") pod "keda-metrics-apiserver-7c9f485588-dg4gq" (UID: "3ac8ca82-4f21-44a1-876f-d7da5c42b0c0") : references non-existent secret key: tls.crt
Apr 22 18:43:02.801110 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:02.801079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:02.801487 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:02.801191 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 22 18:43:02.801487 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:02.801202 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 22 18:43:02.801487 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:02.801220 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq: references non-existent secret key: tls.crt
Apr 22 18:43:02.801487 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:43:02.801272 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates podName:3ac8ca82-4f21-44a1-876f-d7da5c42b0c0 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:06.801259797 +0000 UTC m=+428.867472845 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates") pod "keda-metrics-apiserver-7c9f485588-dg4gq" (UID: "3ac8ca82-4f21-44a1-876f-d7da5c42b0c0") : references non-existent secret key: tls.crt
Apr 22 18:43:06.830467 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:06.830433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:06.832913 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:06.832891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3ac8ca82-4f21-44a1-876f-d7da5c42b0c0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-dg4gq\" (UID: \"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:06.919028 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:06.918972 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:07.041102 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:07.041076 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"]
Apr 22 18:43:07.043350 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:43:07.043317 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac8ca82_4f21_44a1_876f_d7da5c42b0c0.slice/crio-f442e2e82f3217c82cd15bff6dd392a0c4e22ef60b38a2d98ed754e0aac6ae33 WatchSource:0}: Error finding container f442e2e82f3217c82cd15bff6dd392a0c4e22ef60b38a2d98ed754e0aac6ae33: Status 404 returned error can't find the container with id f442e2e82f3217c82cd15bff6dd392a0c4e22ef60b38a2d98ed754e0aac6ae33
Apr 22 18:43:07.044548 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:07.044528 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:43:07.717588 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:07.717554 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq" event={"ID":"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0","Type":"ContainerStarted","Data":"f442e2e82f3217c82cd15bff6dd392a0c4e22ef60b38a2d98ed754e0aac6ae33"}
Apr 22 18:43:11.729673 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:11.729587 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq" event={"ID":"3ac8ca82-4f21-44a1-876f-d7da5c42b0c0","Type":"ContainerStarted","Data":"f4101cdea727312bbb240465d4cc68f66bc4bf447bfc9340e4633f1e836cc280"}
Apr 22 18:43:11.730054 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:11.729702 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:43:11.748226 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:11.748169 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq" podStartSLOduration=8.316216895 podStartE2EDuration="12.748148257s" podCreationTimestamp="2026-04-22 18:42:59 +0000 UTC" firstStartedPulling="2026-04-22 18:43:07.04468862 +0000 UTC m=+429.110901667" lastFinishedPulling="2026-04-22 18:43:11.476619981 +0000 UTC m=+433.542833029" observedRunningTime="2026-04-22 18:43:11.747223393 +0000 UTC m=+433.813436463" watchObservedRunningTime="2026-04-22 18:43:11.748148257 +0000 UTC m=+433.814361329"
Apr 22 18:43:22.736784 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:43:22.736704 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-dg4gq"
Apr 22 18:44:04.201736 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.201704 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"]
Apr 22 18:44:04.203636 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.203621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8"
Apr 22 18:44:04.206359 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.206338 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 22 18:44:04.206359 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.206356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 18:44:04.207336 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.207322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 18:44:04.207396 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.207380 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-blsfw\""
Apr 22 18:44:04.213633 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.213611 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-546xt\" (UniqueName: \"kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8"
Apr 22 18:44:04.213739 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.213667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8"
Apr 22 18:44:04.215182 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.215162 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"]
Apr 22 18:44:04.257142 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.257123 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-4phhd"]
Apr 22 18:44:04.258987 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.258973 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4phhd"
Apr 22 18:44:04.261597 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.261577 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 18:44:04.261916 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.261898 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8mrsg\""
Apr 22 18:44:04.271773 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.271753 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4phhd"]
Apr 22 18:44:04.313977 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.313955 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8"
Apr 22 18:44:04.314075 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.313997 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bpx\" (UniqueName: \"kubernetes.io/projected/298c057f-f82b-4451-87f2-784d2bc3602d-kube-api-access-s6bpx\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd"
Apr 22 18:44:04.314075 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.314023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-546xt\" (UniqueName:
\"kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:04.314075 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.314045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/298c057f-f82b-4451-87f2-784d2bc3602d-data\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.314186 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:44:04.314112 2571 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 18:44:04.314186 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:44:04.314167 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert podName:2a670dea-62e9-43f2-b144-893e7e2c1040 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.814150812 +0000 UTC m=+486.880363870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert") pod "kserve-controller-manager-d9c56dd68-qs4w8" (UID: "2a670dea-62e9-43f2-b144-893e7e2c1040") : secret "kserve-webhook-server-cert" not found Apr 22 18:44:04.323353 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.323330 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-546xt\" (UniqueName: \"kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:04.415176 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.415153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bpx\" (UniqueName: \"kubernetes.io/projected/298c057f-f82b-4451-87f2-784d2bc3602d-kube-api-access-s6bpx\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.415298 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.415198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/298c057f-f82b-4451-87f2-784d2bc3602d-data\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.415520 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.415501 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/298c057f-f82b-4451-87f2-784d2bc3602d-data\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.423911 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.423890 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s6bpx\" (UniqueName: \"kubernetes.io/projected/298c057f-f82b-4451-87f2-784d2bc3602d-kube-api-access-s6bpx\") pod \"seaweedfs-86cc847c5c-4phhd\" (UID: \"298c057f-f82b-4451-87f2-784d2bc3602d\") " pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.567090 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.567023 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:04.679804 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.679775 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-4phhd"] Apr 22 18:44:04.682605 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:44:04.682583 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298c057f_f82b_4451_87f2_784d2bc3602d.slice/crio-43445145e8facbb9974331d58ed4e3e7fbe1b8905ac09b9803b3d626b8719c65 WatchSource:0}: Error finding container 43445145e8facbb9974331d58ed4e3e7fbe1b8905ac09b9803b3d626b8719c65: Status 404 returned error can't find the container with id 43445145e8facbb9974331d58ed4e3e7fbe1b8905ac09b9803b3d626b8719c65 Apr 22 18:44:04.819085 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.819013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:04.821245 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.821219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") pod \"kserve-controller-manager-d9c56dd68-qs4w8\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " 
pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:04.865147 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:04.865120 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4phhd" event={"ID":"298c057f-f82b-4451-87f2-784d2bc3602d","Type":"ContainerStarted","Data":"43445145e8facbb9974331d58ed4e3e7fbe1b8905ac09b9803b3d626b8719c65"} Apr 22 18:44:05.113359 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:05.113326 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:05.252155 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:05.252092 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"] Apr 22 18:44:05.281813 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:44:05.281762 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a670dea_62e9_43f2_b144_893e7e2c1040.slice/crio-471b3863c0600b9f49d8fc435750485f185188bb84649b80a390d11e81cbff4c WatchSource:0}: Error finding container 471b3863c0600b9f49d8fc435750485f185188bb84649b80a390d11e81cbff4c: Status 404 returned error can't find the container with id 471b3863c0600b9f49d8fc435750485f185188bb84649b80a390d11e81cbff4c Apr 22 18:44:05.870082 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:05.870041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" event={"ID":"2a670dea-62e9-43f2-b144-893e7e2c1040","Type":"ContainerStarted","Data":"471b3863c0600b9f49d8fc435750485f185188bb84649b80a390d11e81cbff4c"} Apr 22 18:44:08.878908 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.878858 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-4phhd" 
event={"ID":"298c057f-f82b-4451-87f2-784d2bc3602d","Type":"ContainerStarted","Data":"acae148dbc22ebf4a815d711717cd7257b5d7f29873204e659af495b938503b4"} Apr 22 18:44:08.879324 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.879017 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:08.880127 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.880106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" event={"ID":"2a670dea-62e9-43f2-b144-893e7e2c1040","Type":"ContainerStarted","Data":"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06"} Apr 22 18:44:08.880236 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.880224 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:08.897920 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.897854 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-4phhd" podStartSLOduration=0.974899491 podStartE2EDuration="4.89784308s" podCreationTimestamp="2026-04-22 18:44:04 +0000 UTC" firstStartedPulling="2026-04-22 18:44:04.683919184 +0000 UTC m=+486.750132235" lastFinishedPulling="2026-04-22 18:44:08.606862775 +0000 UTC m=+490.673075824" observedRunningTime="2026-04-22 18:44:08.896822455 +0000 UTC m=+490.963035525" watchObservedRunningTime="2026-04-22 18:44:08.89784308 +0000 UTC m=+490.964056149" Apr 22 18:44:08.912979 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:08.912932 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" podStartSLOduration=1.641388521 podStartE2EDuration="4.912921475s" podCreationTimestamp="2026-04-22 18:44:04 +0000 UTC" firstStartedPulling="2026-04-22 18:44:05.28318493 +0000 UTC m=+487.349397977" lastFinishedPulling="2026-04-22 
18:44:08.554717879 +0000 UTC m=+490.620930931" observedRunningTime="2026-04-22 18:44:08.912556466 +0000 UTC m=+490.978769537" watchObservedRunningTime="2026-04-22 18:44:08.912921475 +0000 UTC m=+490.979134543" Apr 22 18:44:14.886049 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:14.886018 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-4phhd" Apr 22 18:44:39.888878 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:39.888840 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:40.528535 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.528503 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"] Apr 22 18:44:40.528731 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.528694 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" podUID="2a670dea-62e9-43f2-b144-893e7e2c1040" containerName="manager" containerID="cri-o://6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06" gracePeriod=10 Apr 22 18:44:40.552305 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.552282 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-w6gpc"] Apr 22 18:44:40.555615 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.555601 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.565622 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.565596 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-w6gpc"] Apr 22 18:44:40.663719 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.663686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs2sr\" (UniqueName: \"kubernetes.io/projected/990031f4-a957-446a-8717-b366a43ef49a-kube-api-access-rs2sr\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.663845 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.663735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990031f4-a957-446a-8717-b366a43ef49a-cert\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.752667 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.752646 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:40.764127 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.764110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs2sr\" (UniqueName: \"kubernetes.io/projected/990031f4-a957-446a-8717-b366a43ef49a-kube-api-access-rs2sr\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.764190 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.764147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990031f4-a957-446a-8717-b366a43ef49a-cert\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.766290 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.766271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/990031f4-a957-446a-8717-b366a43ef49a-cert\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.774453 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.774431 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs2sr\" (UniqueName: \"kubernetes.io/projected/990031f4-a957-446a-8717-b366a43ef49a-kube-api-access-rs2sr\") pod \"kserve-controller-manager-d9c56dd68-w6gpc\" (UID: \"990031f4-a957-446a-8717-b366a43ef49a\") " pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.865055 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.865031 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") pod \"2a670dea-62e9-43f2-b144-893e7e2c1040\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " Apr 22 18:44:40.865188 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.865114 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-546xt\" (UniqueName: \"kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt\") pod \"2a670dea-62e9-43f2-b144-893e7e2c1040\" (UID: \"2a670dea-62e9-43f2-b144-893e7e2c1040\") " Apr 22 18:44:40.867106 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.867081 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert" (OuterVolumeSpecName: "cert") pod "2a670dea-62e9-43f2-b144-893e7e2c1040" (UID: "2a670dea-62e9-43f2-b144-893e7e2c1040"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:44:40.867209 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.867113 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt" (OuterVolumeSpecName: "kube-api-access-546xt") pod "2a670dea-62e9-43f2-b144-893e7e2c1040" (UID: "2a670dea-62e9-43f2-b144-893e7e2c1040"). InnerVolumeSpecName "kube-api-access-546xt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:44:40.915011 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.914982 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:40.961465 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.961429 2571 generic.go:358] "Generic (PLEG): container finished" podID="2a670dea-62e9-43f2-b144-893e7e2c1040" containerID="6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06" exitCode=0 Apr 22 18:44:40.961696 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.961674 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" Apr 22 18:44:40.962324 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.962135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" event={"ID":"2a670dea-62e9-43f2-b144-893e7e2c1040","Type":"ContainerDied","Data":"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06"} Apr 22 18:44:40.962324 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.962189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-qs4w8" event={"ID":"2a670dea-62e9-43f2-b144-893e7e2c1040","Type":"ContainerDied","Data":"471b3863c0600b9f49d8fc435750485f185188bb84649b80a390d11e81cbff4c"} Apr 22 18:44:40.962324 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.962211 2571 scope.go:117] "RemoveContainer" containerID="6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06" Apr 22 18:44:40.965754 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.965685 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a670dea-62e9-43f2-b144-893e7e2c1040-cert\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\"" Apr 22 18:44:40.965754 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.965725 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-546xt\" (UniqueName: 
\"kubernetes.io/projected/2a670dea-62e9-43f2-b144-893e7e2c1040-kube-api-access-546xt\") on node \"ip-10-0-134-126.ec2.internal\" DevicePath \"\"" Apr 22 18:44:40.971755 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.971735 2571 scope.go:117] "RemoveContainer" containerID="6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06" Apr 22 18:44:40.972139 ip-10-0-134-126 kubenswrapper[2571]: E0422 18:44:40.972103 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06\": container with ID starting with 6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06 not found: ID does not exist" containerID="6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06" Apr 22 18:44:40.972269 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.972158 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06"} err="failed to get container status \"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06\": rpc error: code = NotFound desc = could not find container \"6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06\": container with ID starting with 6a767e0221077957aca8aa5198bff4bc72fddef11c64bcaccb9727c429a22b06 not found: ID does not exist" Apr 22 18:44:40.990160 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.990003 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"] Apr 22 18:44:40.994207 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:40.994185 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-d9c56dd68-qs4w8"] Apr 22 18:44:41.037240 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:41.037215 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve/kserve-controller-manager-d9c56dd68-w6gpc"] Apr 22 18:44:41.040206 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:44:41.040179 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod990031f4_a957_446a_8717_b366a43ef49a.slice/crio-422a63790e48ffbcc1dd9a29dc031e6299615429aa81e6906322dbc050bc359c WatchSource:0}: Error finding container 422a63790e48ffbcc1dd9a29dc031e6299615429aa81e6906322dbc050bc359c: Status 404 returned error can't find the container with id 422a63790e48ffbcc1dd9a29dc031e6299615429aa81e6906322dbc050bc359c Apr 22 18:44:41.966113 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:41.966080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" event={"ID":"990031f4-a957-446a-8717-b366a43ef49a","Type":"ContainerStarted","Data":"26ea6375eaee3475e43e330fbce73b839a4cec05b8458b23f20b2ae31e8a7fb8"} Apr 22 18:44:41.966542 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:41.966118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" event={"ID":"990031f4-a957-446a-8717-b366a43ef49a","Type":"ContainerStarted","Data":"422a63790e48ffbcc1dd9a29dc031e6299615429aa81e6906322dbc050bc359c"} Apr 22 18:44:41.966542 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:41.966217 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:44:41.998062 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:41.998021 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" podStartSLOduration=1.427898169 podStartE2EDuration="1.998008026s" podCreationTimestamp="2026-04-22 18:44:40 +0000 UTC" firstStartedPulling="2026-04-22 18:44:41.04142654 +0000 UTC m=+523.107639587" lastFinishedPulling="2026-04-22 18:44:41.611536394 +0000 UTC 
m=+523.677749444" observedRunningTime="2026-04-22 18:44:41.997173084 +0000 UTC m=+524.063386152" watchObservedRunningTime="2026-04-22 18:44:41.998008026 +0000 UTC m=+524.064221095" Apr 22 18:44:42.512329 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:44:42.512299 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a670dea-62e9-43f2-b144-893e7e2c1040" path="/var/lib/kubelet/pods/2a670dea-62e9-43f2-b144-893e7e2c1040/volumes" Apr 22 18:45:12.974271 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:12.974236 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-d9c56dd68-w6gpc" Apr 22 18:45:13.844066 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.844039 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-wgkjl"] Apr 22 18:45:13.844285 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.844275 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a670dea-62e9-43f2-b144-893e7e2c1040" containerName="manager" Apr 22 18:45:13.844326 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.844286 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a670dea-62e9-43f2-b144-893e7e2c1040" containerName="manager" Apr 22 18:45:13.844358 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.844328 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a670dea-62e9-43f2-b144-893e7e2c1040" containerName="manager" Apr 22 18:45:13.846963 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.846946 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wgkjl" Apr 22 18:45:13.851454 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.851431 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:45:13.851569 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.851550 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-lrtqh\"" Apr 22 18:45:13.861160 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.861139 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wgkjl"] Apr 22 18:45:13.980231 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.980192 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jhh\" (UniqueName: \"kubernetes.io/projected/297fb7d2-59ba-4f1e-9908-19505271ac48-kube-api-access-m6jhh\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl" Apr 22 18:45:13.980231 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:13.980233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/297fb7d2-59ba-4f1e-9908-19505271ac48-cert\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl" Apr 22 18:45:14.081106 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.081074 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jhh\" (UniqueName: \"kubernetes.io/projected/297fb7d2-59ba-4f1e-9908-19505271ac48-kube-api-access-m6jhh\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl" 
Apr 22 18:45:14.081106 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.081105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/297fb7d2-59ba-4f1e-9908-19505271ac48-cert\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:14.083374 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.083356 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/297fb7d2-59ba-4f1e-9908-19505271ac48-cert\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:14.091646 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.091622 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jhh\" (UniqueName: \"kubernetes.io/projected/297fb7d2-59ba-4f1e-9908-19505271ac48-kube-api-access-m6jhh\") pod \"odh-model-controller-696fc77849-wgkjl\" (UID: \"297fb7d2-59ba-4f1e-9908-19505271ac48\") " pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:14.156474 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.156442 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:14.265004 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:14.264965 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-wgkjl"]
Apr 22 18:45:14.267699 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:45:14.267661 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod297fb7d2_59ba_4f1e_9908_19505271ac48.slice/crio-e79fe44aec35511bd5802ea7362739c5017cc326804d3c9a1d49fa809fb67d76 WatchSource:0}: Error finding container e79fe44aec35511bd5802ea7362739c5017cc326804d3c9a1d49fa809fb67d76: Status 404 returned error can't find the container with id e79fe44aec35511bd5802ea7362739c5017cc326804d3c9a1d49fa809fb67d76
Apr 22 18:45:15.055949 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:15.055912 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wgkjl" event={"ID":"297fb7d2-59ba-4f1e-9908-19505271ac48","Type":"ContainerStarted","Data":"e79fe44aec35511bd5802ea7362739c5017cc326804d3c9a1d49fa809fb67d76"}
Apr 22 18:45:18.065704 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:18.065662 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-wgkjl" event={"ID":"297fb7d2-59ba-4f1e-9908-19505271ac48","Type":"ContainerStarted","Data":"913e997597afc6990ab852429d212dfe7d7941cf3eec6bf1d31ed66c16a2c958"}
Apr 22 18:45:18.066077 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:18.065809 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:18.083203 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:18.083148 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-wgkjl" podStartSLOduration=2.290194597 podStartE2EDuration="5.083130698s" podCreationTimestamp="2026-04-22 18:45:13 +0000 UTC" firstStartedPulling="2026-04-22 18:45:14.26899173 +0000 UTC m=+556.335204776" lastFinishedPulling="2026-04-22 18:45:17.061927826 +0000 UTC m=+559.128140877" observedRunningTime="2026-04-22 18:45:18.082689424 +0000 UTC m=+560.148902496" watchObservedRunningTime="2026-04-22 18:45:18.083130698 +0000 UTC m=+560.149343774"
Apr 22 18:45:29.069890 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:29.069835 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-wgkjl"
Apr 22 18:45:48.986640 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:48.986601 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"]
Apr 22 18:45:48.989077 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:48.989062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"
Apr 22 18:45:48.991555 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:48.991533 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4bhxb\""
Apr 22 18:45:48.997228 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:48.997205 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"]
Apr 22 18:45:48.999433 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:48.999415 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"
Apr 22 18:45:49.111170 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:49.111137 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"]
Apr 22 18:45:49.114365 ip-10-0-134-126 kubenswrapper[2571]: W0422 18:45:49.114337 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac5d664_9f08_4b2a_ba15_9e9ad100ad0f.slice/crio-26ab179b4e94ba067b09fd1871dc922af673c2dccf419a1bdc06fc945053275a WatchSource:0}: Error finding container 26ab179b4e94ba067b09fd1871dc922af673c2dccf419a1bdc06fc945053275a: Status 404 returned error can't find the container with id 26ab179b4e94ba067b09fd1871dc922af673c2dccf419a1bdc06fc945053275a
Apr 22 18:45:49.146483 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:45:49.146454 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" event={"ID":"cac5d664-9f08-4b2a-ba15-9e9ad100ad0f","Type":"ContainerStarted","Data":"26ab179b4e94ba067b09fd1871dc922af673c2dccf419a1bdc06fc945053275a"}
Apr 22 18:46:01.185826 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:01.185773 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" event={"ID":"cac5d664-9f08-4b2a-ba15-9e9ad100ad0f","Type":"ContainerStarted","Data":"c4b96091a622ce9316183037e4177e4432e5ca4249b35fc117b46d706faa3614"}
Apr 22 18:46:01.186296 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:01.185909 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"
Apr 22 18:46:01.187190 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:01.187166 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:01.201574 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:01.201529 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podStartSLOduration=1.7143822869999998 podStartE2EDuration="13.201513342s" podCreationTimestamp="2026-04-22 18:45:48 +0000 UTC" firstStartedPulling="2026-04-22 18:45:49.115954205 +0000 UTC m=+591.182167252" lastFinishedPulling="2026-04-22 18:46:00.603085257 +0000 UTC m=+602.669298307" observedRunningTime="2026-04-22 18:46:01.200394819 +0000 UTC m=+603.266607888" watchObservedRunningTime="2026-04-22 18:46:01.201513342 +0000 UTC m=+603.267726412"
Apr 22 18:46:02.188487 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:02.188454 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:12.189025 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:12.188981 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:22.188570 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:22.188485 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:32.189294 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:32.189251 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:42.188961 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:42.188921 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq" podUID="cac5d664-9f08-4b2a-ba15-9e9ad100ad0f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:46:52.189907 ip-10-0-134-126 kubenswrapper[2571]: I0422 18:46:52.189855 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-82b82-predictor-f89b8bb-f4qxq"
Apr 22 19:46:06.420627 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:06.420593 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:06.877130 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:06.877099 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:07.369814 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:07.369781 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:07.794536 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:07.794452 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:08.225645 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:08.225609 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:08.657334 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:08.657307 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:09.116424 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:09.116388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:09.559708 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:09.559629 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:09.991146 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:09.991112 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:10.419104 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:10.419071 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:10.871479 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:10.871444 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:11.307948 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:11.307845 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:11.736818 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:11.736779 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:12.169515 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:12.169489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:12.611804 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:12.611776 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:13.048812 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:13.048732 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:13.490419 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:13.490390 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:13.946945 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:13.946921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:14.406655 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:14.406629 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:14.886950 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:14.886925 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:15.348230 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:15.348197 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:15.949439 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:15.949408 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:16.382020 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:16.381984 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:16.816910 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:16.816815 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:17.282927 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:17.282894 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:17.726812 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:17.726785 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:18.171846 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:18.171816 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:18.621864 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:18.621836 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:19.074527 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:19.074442 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:19.499978 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:19.499947 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:19.942175 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:19.942144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:20.379224 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:20.379191 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:20.812839 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:20.812767 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:21.290154 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:21.290117 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:21.724475 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:21.724445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_success-200-isvc-82b82-predictor-f89b8bb-f4qxq_cac5d664-9f08-4b2a-ba15-9e9ad100ad0f/kserve-container/0.log"
Apr 22 19:46:26.291089 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:26.291049 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hlw26_6adad479-9e37-4d83-b878-e22fdf6dc3d9/global-pull-secret-syncer/0.log"
Apr 22 19:46:26.412834 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:26.412798 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-llq6g_dcc5a374-518f-4fe9-b03c-6c43c581735c/konnectivity-agent/0.log"
Apr 22 19:46:26.518973 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:26.518945 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-126.ec2.internal_21199ccb1140108a6b341eb7b765d477/haproxy/0.log"
Apr 22 19:46:30.370734 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:30.370709 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/node-exporter/0.log"
Apr 22 19:46:30.389799 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:30.389778 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/kube-rbac-proxy/0.log"
Apr 22 19:46:30.416289 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:30.412372 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hn449_08418377-4601-42e1-a730-06d7b3fedf8c/init-textfile/0.log"
Apr 22 19:46:33.336526 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.336493 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"]
Apr 22 19:46:33.339570 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.339552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.342044 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.342016 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5hs9k\"/\"default-dockercfg-9t4h8\""
Apr 22 19:46:33.342144 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.342114 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"openshift-service-ca.crt\""
Apr 22 19:46:33.343017 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.343003 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"kube-root-ca.crt\""
Apr 22 19:46:33.347167 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.347129 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"]
Apr 22 19:46:33.352360 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.352338 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-sys\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.352463 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.352395 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-podres\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.352463 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.352422 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-lib-modules\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.352565 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.352464 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-proc\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.352565 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.352490 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg7kj\" (UniqueName: \"kubernetes.io/projected/e2847867-0b6d-49d0-a002-3e0a60195213-kube-api-access-gg7kj\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453083 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-podres\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453083 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453086 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-lib-modules\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-proc\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453147 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg7kj\" (UniqueName: \"kubernetes.io/projected/e2847867-0b6d-49d0-a002-3e0a60195213-kube-api-access-gg7kj\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453174 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-sys\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-proc\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-lib-modules\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453232 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-podres\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.453300 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.453263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2847867-0b6d-49d0-a002-3e0a60195213-sys\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.475128 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.475098 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg7kj\" (UniqueName: \"kubernetes.io/projected/e2847867-0b6d-49d0-a002-3e0a60195213-kube-api-access-gg7kj\") pod \"perf-node-gather-daemonset-2fwql\" (UID: \"e2847867-0b6d-49d0-a002-3e0a60195213\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.649967 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.649939 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:33.763680 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.763631 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"]
Apr 22 19:46:33.766195 ip-10-0-134-126 kubenswrapper[2571]: W0422 19:46:33.766167 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode2847867_0b6d_49d0_a002_3e0a60195213.slice/crio-12cfa809b2d21c854f586407ebc18ee3cf6dce64efbca5ed605da84a56e2a30a WatchSource:0}: Error finding container 12cfa809b2d21c854f586407ebc18ee3cf6dce64efbca5ed605da84a56e2a30a: Status 404 returned error can't find the container with id 12cfa809b2d21c854f586407ebc18ee3cf6dce64efbca5ed605da84a56e2a30a
Apr 22 19:46:33.767683 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:33.767667 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:46:34.288000 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.287970 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pfpxj_cc7162b6-24ec-432c-8317-6eff35ee7f87/dns/0.log"
Apr 22 19:46:34.307060 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.307017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pfpxj_cc7162b6-24ec-432c-8317-6eff35ee7f87/kube-rbac-proxy/0.log"
Apr 22 19:46:34.327671 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.327650 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-47tpt_2bd7e4a3-0f74-4ca3-848f-f713afd48c22/dns-node-resolver/0.log"
Apr 22 19:46:34.704599 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.704568 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql" event={"ID":"e2847867-0b6d-49d0-a002-3e0a60195213","Type":"ContainerStarted","Data":"a187b34b89f175ac0be56b60448b6ccfd74466b829dce13805b8d1aa08bf494f"}
Apr 22 19:46:34.704599 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.704601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql" event={"ID":"e2847867-0b6d-49d0-a002-3e0a60195213","Type":"ContainerStarted","Data":"12cfa809b2d21c854f586407ebc18ee3cf6dce64efbca5ed605da84a56e2a30a"}
Apr 22 19:46:34.705027 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.704680 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:34.722103 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.722066 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql" podStartSLOduration=1.72205413 podStartE2EDuration="1.72205413s" podCreationTimestamp="2026-04-22 19:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:34.720838743 +0000 UTC m=+4236.787051836" watchObservedRunningTime="2026-04-22 19:46:34.72205413 +0000 UTC m=+4236.788267198"
Apr 22 19:46:34.812317 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:34.812290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2df55_20357bd7-0dd6-4792-92d1-d5705073a86e/node-ca/0.log"
Apr 22 19:46:35.886526 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:35.886493 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v26l6_8e29b790-9fe8-4412-be58-f5c6ee203578/serve-healthcheck-canary/0.log"
Apr 22 19:46:36.256158 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:36.256068 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6lmd2_2226793d-aee6-48b4-898d-378b00cbc606/kube-rbac-proxy/0.log"
Apr 22 19:46:36.274681 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:36.274655 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6lmd2_2226793d-aee6-48b4-898d-378b00cbc606/exporter/0.log"
Apr 22 19:46:36.294207 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:36.294189 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6lmd2_2226793d-aee6-48b4-898d-378b00cbc606/extractor/0.log"
Apr 22 19:46:38.282147 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:38.282106 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-d9c56dd68-w6gpc_990031f4-a957-446a-8717-b366a43ef49a/manager/0.log"
Apr 22 19:46:38.791612 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:38.791579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-wgkjl_297fb7d2-59ba-4f1e-9908-19505271ac48/manager/0.log"
Apr 22 19:46:38.840707 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:38.840676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-4phhd_298c057f-f82b-4451-87f2-784d2bc3602d/seaweedfs/0.log"
Apr 22 19:46:40.716937 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:40.716911 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-2fwql"
Apr 22 19:46:43.869461 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.869434 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:46:43.889731 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.889702 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/egress-router-binary-copy/0.log"
Apr 22 19:46:43.910252 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.910230 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/cni-plugins/0.log"
Apr 22 19:46:43.929430 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.929413 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/bond-cni-plugin/0.log"
Apr 22 19:46:43.951995 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.951974 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/routeoverride-cni/0.log"
Apr 22 19:46:43.972798 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.972782 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/whereabouts-cni-bincopy/0.log"
Apr 22 19:46:43.995627 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:43.995609 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d5f7s_b1680f68-7360-441e-8836-c911ac062e82/whereabouts-cni/0.log"
Apr 22 19:46:44.237861 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:44.237831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ql2h8_539b75c3-f4eb-4f78-bdd1-438056001519/kube-multus/0.log"
Apr 22 19:46:44.255835 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:44.255803 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8llv8_7ff729b2-3f8f-4665-ae12-c83c4b179998/network-metrics-daemon/0.log"
Apr 22 19:46:44.275832 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:44.275813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8llv8_7ff729b2-3f8f-4665-ae12-c83c4b179998/kube-rbac-proxy/0.log"
Apr 22 19:46:45.077975 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.077952 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/ovn-controller/0.log"
Apr 22 19:46:45.130669 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.130629 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/ovn-acl-logging/0.log"
Apr 22 19:46:45.152615 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.152591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/kube-rbac-proxy-node/0.log"
Apr 22 19:46:45.174921 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.174897 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:46:45.190072 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.190049 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/northd/0.log"
Apr 22 19:46:45.208983 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.208950 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/nbdb/0.log" Apr 22 19:46:45.233410 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.233392 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/sbdb/0.log" Apr 22 19:46:45.393737 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:45.393717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ncvvr_578f7fc7-df68-49e8-ab1c-e8782370ea85/ovnkube-controller/0.log" Apr 22 19:46:47.067418 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:47.067392 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sd9g2_743f27e4-2f91-43c9-a360-363424f5563c/network-check-target-container/0.log" Apr 22 19:46:48.017700 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:48.017667 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xwhrw_481b4ac3-0a49-4659-8d45-8a350c162d28/iptables-alerter/0.log" Apr 22 19:46:48.576100 ip-10-0-134-126 kubenswrapper[2571]: I0422 19:46:48.576074 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bwbj7_bcfd0f0f-fa19-459b-9541-ffe992fad530/tuned/0.log"