Apr 20 14:55:06.285980 ip-10-0-141-9 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:55:06.731560 ip-10-0-141-9 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:06.731560 ip-10-0-141-9 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:55:06.731560 ip-10-0-141-9 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:06.731560 ip-10-0-141-9 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:55:06.731560 ip-10-0-141-9 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:06.732998 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.731648 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:55:06.734461 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734435 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:06.734461 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734457 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:06.734461 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734465 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:06.734461 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734469 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734475 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734479 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734483 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734487 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734491 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734495 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734498 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734502 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734505 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734509 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734513 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734517 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734521 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734525 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734529 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734533 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734537 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734541 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734545 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:06.734751 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734553 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734558 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734561 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734565 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734569 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734572 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734576 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734579 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734583 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734587 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734591 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734594 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734598 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734603 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734608 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734613 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734617 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734621 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734626 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:06.735552 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734630 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734634 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734640 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734644 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734648 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734652 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734656 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734660 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734664 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734669 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734673 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734678 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734701 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734704 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734708 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734713 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734717 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734720 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734725 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734729 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:06.736233 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734735 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734739 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734743 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734747 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734751 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734755 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734759 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734765 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734769 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734773 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734778 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734782 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734788 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734794 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734797 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734802 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734806 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734810 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734814 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734818 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:06.736739 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734822 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734827 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734832 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.734836 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735453 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735463 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735469 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735473 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735479 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735484 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735488 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735493 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735497 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735501 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735506 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735510 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735514 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735519 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735524 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:06.737297 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735529 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735534 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735538 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735542 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735546 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735550 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735555 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735559 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735563 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735566 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735570 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735574 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735578 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735583 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735587 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735591 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735596 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735601 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735605 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735609 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:06.737832 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735613 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735619 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735624 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735629 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735633 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735637 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735642 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735646 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735650 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735655 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735660 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735664 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735669 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735673 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735697 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735702 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735707 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735711 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735715 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735720 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:06.738426 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735724 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735728 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735732 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735736 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735741 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735745 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735749 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735753 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735757 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735762 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735766 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735771 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735775 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735779 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735783 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735790 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735794 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735798 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735802 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735807 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:06.739307 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735811 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735815 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735819 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735823 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735828 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735833 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735837 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735842 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735846 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735849 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.735853 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738275 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738296 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738308 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738316 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738323 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738328 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738335 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738343 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738348 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738353 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:55:06.740048 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738359 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738365 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738370 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738376 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738381 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738386 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738391 2571 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738395 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738400 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738408 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738413 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738419 2571 flags.go:64] FLAG: --config-dir=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738423 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738429 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738435 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738440 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738445 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738451 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738456 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738461 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738466 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738471 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738476 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738483 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738488 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:55:06.740674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738493 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738498 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738503 2571 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738508 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738516 2571 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738521 2571 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738526 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738531 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738537 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738543 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738548 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738553 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738558 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738563 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738569 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738573 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738578 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738584 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738589 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738593 2571 flags.go:64] FLAG: --feature-gates=""
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738600 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738605 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738610 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738615 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738620 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 14:55:06.741420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738625 2571 flags.go:64] FLAG: --help="false"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738630 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-141-9.ec2.internal"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738635 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738640 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738645 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738651 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738656 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738662 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738667 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738672 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738677 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738698 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738704 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738708 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738714 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738719 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738724 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738729 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738733 2571 flags.go:64] FLAG: --lock-file=""
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738738 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738742 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738748 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738757 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 14:55:06.742098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738761 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738767 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738771 2571 flags.go:64] FLAG: --logging-format="text"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738776 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738782 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738787 2571 flags.go:64] FLAG: --manifest-url=""
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738791 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738799 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738804 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738810 2571 flags.go:64] FLAG: --max-pods="110"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738815 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738819 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738825 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738830 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738835 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738840 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738844 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738859 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738864 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738870 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738874 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738879 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738888 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738893 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 14:55:06.742676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738898 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738904 2571 flags.go:64] FLAG: --port="10250"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738909 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738914 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07211737ff283c44a"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738920 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738924 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738929 2571 flags.go:64] FLAG: --register-node="true"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738934 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738939 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738945 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738949 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738954 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738959 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738965 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738970 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738975 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738979 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738984 2571 flags.go:64] FLAG: --runonce="false"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738989 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.738995 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739000 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739004 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739009 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739014 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739019 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739024 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 14:55:06.743300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739030 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739035 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739040 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739045 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739050 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739055 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739060 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739070 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739075 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739080 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739087 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739092 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739096 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739101 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739106 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739111 2571 flags.go:64] FLAG: --v="2"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739118 2571 flags.go:64] FLAG: --version="false"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739124 2571 flags.go:64] FLAG: --vmodule=""
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739131 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739137 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739285 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739291 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739298 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:06.743966 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739304 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739311 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739318 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739322 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739326 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739331 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739335 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739340 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739344 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739348 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739353 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739358 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739362 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739366 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739370 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739374 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739379 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739383 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739387 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739391 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:06.744515 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739395 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739399 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739403 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739407 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739411 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739415 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739419 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739424 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739428 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739432 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739436 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739441 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739445 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739449 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739453 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739458 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739462 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739466 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739470 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739474 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:06.745079 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739478 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739482 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739487 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739492 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739496 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739500 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739504 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739508 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739513 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739517 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739521 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739525 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739529 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739534 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739538 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739542 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739545 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739549 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739553 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739558 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:06.745580 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739562 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739566 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739571 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739575 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739579 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739583 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739586 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739591 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739595 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739599 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739603 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739607 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739612 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739616 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739620 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739624 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739629 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739633 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739637 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:06.746202 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739642 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:06.746665 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739652 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:06.746665 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739656 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:06.746665 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.739660 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:06.746665 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.739668 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:06.748302 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.748283 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:55:06.748340 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.748304 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
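
Note: the feature_gate.go:384 line above records the effective kubelet gate map after the warnings, printed in Go's map[key:value ...] syntax, so it can be recovered mechanically from a saved journal. A minimal sketch in Python, assuming the journal has been saved to a local file (the filename kubelet.log, and producing it via something like `journalctl -u kubelet > kubelet.log`, are assumptions, not part of this log):

    #!/usr/bin/env python3
    # Extract the last effective feature-gate map logged by feature_gate.go:384.
    import re
    import sys

    GATE_MAP = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

    def effective_gates(path):
        gates = {}
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                m = GATE_MAP.search(line)
                if m:
                    # The kubelet logs this map once per parse; keep the last one seen.
                    gates = dict(pair.split(":", 1) for pair in m.group(1).split())
        return gates

    if __name__ == "__main__":
        for name, value in sorted(effective_gates(sys.argv[1]).items()):
            print(f"{name}={value}")

Run as `python3 gates.py kubelet.log`; on the map above it prints KMSv1=true, NodeSwap=false, and so on, one gate per line.
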
Apr 20 14:55:06.748371 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748366 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:06.748371 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748371 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748375 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748379 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748383 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748388 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748391 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748394 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748397 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748400 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748403 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748406 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748408 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748411 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748414 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748416 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748419 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748422 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748424 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:06.748420 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748428 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748431 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748434 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748436 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748439 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748442 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748444 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748447 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748449 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748452 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748454 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748457 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748460 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748462 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748465 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748468 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748470 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748473 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748475 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748478 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:06.748885 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748480 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748483 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748485 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748487 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748490 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748492 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748495 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748497 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748500 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748502 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748505 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748508 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748511 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748514 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748517 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748520 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748523 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748525 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748528 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748530 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:06.749377 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748533 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748535 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748538 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748540 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748543 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748546 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748549 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748551 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748554 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748556 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748559 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748561 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748564 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748566 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748568 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748571 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748573 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748576 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748579 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748581 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:06.749887 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748583 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748587 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748591 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748594 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748597 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748599 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748602 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.748607 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748721 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748727 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748731 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748734 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748738 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748741 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748744 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:06.750397 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748747 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748751 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748754 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748758 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748761 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748764 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748766 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748769 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748772 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748774 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748777 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748779 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748782 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748784 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748787 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748789 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748792 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748795 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748797 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:06.750783 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748800 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748802 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748805 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748808 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748811 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748814 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748816 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748819 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748821 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748824 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748826 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748829 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748831 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748834 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748836 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748839 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748841 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748844 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748847 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748849 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:06.751282 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748852 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748854 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748857 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748860 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748862 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748865 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748867 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748870 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748873 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748875 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748878 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748880 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748883 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748885 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748888 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748890 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748893 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748896 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748899 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748901 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:06.751791 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748904 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748906 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748909 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748911 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748914 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748916 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748919 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748921 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748924 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748926 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748929 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748931 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748934 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748936 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748939 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748941 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748944 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748946 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748949 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:06.752281 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:06.748952 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:06.752744 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.748958 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:06.752744 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.749061 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:55:06.753303 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.753272 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:55:06.754129 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.754117 2571 server.go:1019] "Starting client certificate rotation"
Apr 20 14:55:06.754240 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.754224 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:06.754310 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.754264 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:06.780033 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.780014 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:06.788817 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.788791 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:06.801800 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.801779 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:55:06.809044 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.809023 2571 log.go:25] "Validated CRI v1 image API"
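
Note: the gate set is parsed more than once during startup, so each unrecognized gate is warned about repeatedly (here three passes, at 14:55:06.739xxx, .7483xx-.7486xx, and .7487xx-.7489xx, each ending in the same feature_gate.go:384 map). To dedupe the noise, a small sketch that counts distinct gate names, assuming saved journal text is piped in on stdin (the `journalctl -u kubelet` invocation is an assumption, not shown in this log):

    #!/usr/bin/env python3
    # Count how often each unrecognized feature gate is warned about.
    import re
    import sys
    from collections import Counter

    WARN = re.compile(r"unrecognized feature gate: (\S+)")

    counts = Counter(m.group(1) for m in WARN.finditer(sys.stdin.read()))
    for gate, n in counts.most_common():
        print(f"{n:3d}  {gate}")
    print(f"{len(counts)} distinct gates, {sum(counts.values())} warnings total")

Each distinct name should show a count equal to the number of parse passes; a gate appearing only once would be the oddity worth a closer look.
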
Apr 20 14:55:06.812740 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.812724 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:55:06.814158 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.814142 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:55:06.818049 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.818022 2571 fs.go:135] Filesystem UUIDs: map[68a3353e-f512-49fb-b563-64f8e6b26018:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fcae5f34-0b97-4f38-8fa1-bdb9716f9046:/dev/nvme0n1p3]
Apr 20 14:55:06.818119 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.818048 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:55:06.824519 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.824402 2571 manager.go:217] Machine: {Timestamp:2026-04-20 14:55:06.822083883 +0000 UTC m=+0.415902936 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099749 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b72bc10a93e14a2f17286c9985e5c SystemUUID:ec2b72bc-10a9-3e14-a2f1-7286c9985e5c BootID:318f1f48-e4af-476e-ad4f-d252914f55da Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:16:55:65:ae:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:16:55:65:ae:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:dc:10:75:70:5e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:55:06.824519 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.824514 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:55:06.824649 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.824637 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:55:06.825849 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.825823 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:55:06.826004 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.825852 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-9.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:55:06.826054 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.826013 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:55:06.826054 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.826022 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:55:06.826054 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.826035 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:55:06.826761 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.826750 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:55:06.827751 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.827741 2571 state_mem.go:36] "Initialized new in-memory state store"
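
Note: the container_manager_linux.go:275 entry above embeds the resolved node configuration as one JSON object (nodeConfig={...}), including SystemReserved and the five HardEvictionThresholds. A sketch that pretty-prints it, assuming the journal text arrives on stdin and that the payload is plain JSON (true of the line above, whose strings contain no braces, so simple brace matching suffices):

    #!/usr/bin/env python3
    # Pretty-print the nodeConfig JSON embedded in the kubelet log.
    import json
    import sys

    text = sys.stdin.read()
    start = text.index("nodeConfig={") + len("nodeConfig=")
    depth = 0
    for i, ch in enumerate(text[start:], start):
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                # Slice out the balanced {...} object and reformat it.
                print(json.dumps(json.loads(text[start:i + 1]), indent=2))
                break

The indented output makes it easy to confirm, for example, that memory.available has a hard eviction threshold of 100Mi and that SystemReserved matches the --system-reserved flag the kubelet warned about at startup.
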
Apr 20 14:55:06.827863 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.827854 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 14:55:06.830420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.830411 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 14:55:06.830460 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.830431 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 14:55:06.830460 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.830444 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 14:55:06.830460 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.830453 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 20 14:55:06.830571 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.830466 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 14:55:06.831743 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.831729 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:55:06.831820 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.831749 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:55:06.835062 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.835045 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 14:55:06.836867 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.836851 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 14:55:06.838576 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838562 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838584 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838593 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838602 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838611 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838620 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838629 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838637 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 14:55:06.838648 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838647 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 14:55:06.838929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838658 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 14:55:06.838929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838670 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 14:55:06.838929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.838706 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 14:55:06.839932 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.839921 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 14:55:06.839982 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.839937 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 14:55:06.842145 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.842123 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-9.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 14:55:06.842145 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.842131 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:55:06.842414 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.842364 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:55:06.844387 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.844369 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 14:55:06.844465 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.844419 2571 server.go:1295] "Started kubelet"
Apr 20 14:55:06.844525 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.844504 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 14:55:06.844609 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.844566 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 14:55:06.844771 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.844624 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 14:55:06.845371 ip-10-0-141-9 systemd[1]: Started Kubernetes Kubelet.
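
Note: the `system:anonymous ... forbidden` failures above are consistent with the TLS bootstrap still being in flight: the client certificate requested at 14:55:06.754264 has not yet been issued (the CSR csr-52x7x is approved and issued only at 14:55:06.8535-06.8596 below), so API calls fall back to anonymous and are rejected; such errors are typically transient. A quick way to survey error-level entries (klog lines beginning with E) and see which source sites emit them, assuming saved journal text on stdin:

    #!/usr/bin/env python3
    # Group error-level kubelet entries by klog source file:line.
    import re
    import sys
    from collections import Counter

    # klog prefix shape: E0420 14:55:06.842131 2571 reflector.go:200] ...
    ERR = re.compile(r"\bE\d{4} (\d{2}:\d{2}:\d{2}\.\d+)\s+\d+ (\S+?)\]")

    by_site = Counter()
    for m in ERR.finditer(sys.stdin.read()):
        by_site[m.group(2)] += 1
    for site, n in by_site.most_common():
        print(f"{n:3d}  {site}")

If the same sites keep erroring well after the certificate is issued, that would point at a real RBAC or bootstrap problem rather than ordinary startup ordering.
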
Apr 20 14:55:06.845925 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.845888 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:55:06.847698 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.847672 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:55:06.851273 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.851256 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:55:06.851925 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.851904 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:55:06.852637 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.852620 2571 factory.go:55] Registering systemd factory
Apr 20 14:55:06.852981 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.852961 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:55:06.853095 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.852740 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:55:06.853095 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.852803 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found"
Apr 20 14:55:06.853192 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.852626 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:55:06.853237 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853226 2571 factory.go:153] Registering CRI-O factory
Apr 20 14:55:06.853237 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853235 2571 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:55:06.853332 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853286 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:55:06.853537 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853516 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-52x7x"
Apr 20 14:55:06.853597 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853560 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:55:06.853710 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853666 2571 factory.go:103] Registering Raw factory
Apr 20 14:55:06.853710 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853677 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:55:06.853819 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853716 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:55:06.853819 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.853707 2571 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:55:06.854163 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.854139 2571 manager.go:319] Starting recovery of all containers
Apr 20 14:55:06.855009 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.854976 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 14:55:06.859676 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.859653 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-52x7x"
Apr 20 14:55:06.859676 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.859659 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 14:55:06.859864 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.859820 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-9.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 14:55:06.860645 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.859760 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-9.ec2.internal.18a8186c65feb088 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-9.ec2.internal,UID:ip-10-0-141-9.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-9.ec2.internal,},FirstTimestamp:2026-04-20 14:55:06.844385416 +0000 UTC m=+0.438204473,LastTimestamp:2026-04-20 14:55:06.844385416 +0000 UTC m=+0.438204473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-9.ec2.internal,}"
Apr 20 14:55:06.865150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.865132 2571 manager.go:324] Recovery completed
Apr 20 14:55:06.868113 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.868088 2571 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 14:55:06.871701 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.871675 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:06.874437 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.874423 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:06.874508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.874450 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:06.874508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.874460 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:06.875004 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.874989 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:55:06.875004 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.875002 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:55:06.875113 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.875017 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:55:06.877866 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.877854 2571 policy_none.go:49] "None policy: Start"
Apr 20 14:55:06.877912 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.877870 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:55:06.877912 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.877880 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:55:06.913260 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913244 2571 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.913321 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913336 2571 server.go:85] "Starting device plugin registration server"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913595 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913607 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913703 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913782 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.913792 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.914321 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.914360 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-9.ec2.internal\" not found"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.926028 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.927208 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.927228 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.927245 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
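
Note: between 14:55:06.913 and 14:55:06.928 the device-plugin, eviction, log-rotation and plugin managers all come up, and the journal timestamps are precise enough for a rough startup timeline. A sketch, assuming journal text with one entry per line on stdin and same-day timestamps (marker strings are taken from this log; "Successfully registered node" appears below):

    #!/usr/bin/env python3
    # Rough kubelet startup timeline from journal text on stdin.
    import re
    import sys
    from datetime import datetime

    MARKS = [
        "Starting Kubernetes Kubelet",
        "Validated CRI v1 runtime API",
        "Started kubelet",
        "Starting kubelet main sync loop",
        "Successfully registered node",
    ]
    STAMP = re.compile(r"^\w{3} \d{2} (\d{2}:\d{2}:\d{2}\.\d{6}) ")

    first = {}
    for line in sys.stdin:
        m = STAMP.match(line)
        if not m:
            continue
        for mark in MARKS:
            if mark in line and mark not in first:
                first[mark] = datetime.strptime(m.group(1), "%H:%M:%S.%f")

    base = min(first.values(), default=None)
    for mark in MARKS:
        if mark in first:
            print(f"+{(first[mark] - base).total_seconds():8.3f}s  {mark}")

On this boot the whole span from the systemd start message to node registration is roughly three-quarters of a second.
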
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.927251 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 14:55:06.928418 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:06.927280 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 14:55:06.929306 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:06.929289 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:55:07.014370 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.014280 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:07.015644 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.015626 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:07.015767 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.015656 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:07.015767 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.015666 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:07.015767 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.015706 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.024986 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.024967 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.025067 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.024990 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-9.ec2.internal\": node \"ip-10-0-141-9.ec2.internal\" not found"
Apr 20 14:55:07.027416 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.027387 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal"]
Apr 20 14:55:07.027492 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.027458 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:07.028308 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.028287 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:07.028410 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.028319 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:07.028410 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.028330 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:07.029782 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.029770 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:07.029952 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.029917 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.030015 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.029966 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:07.030396 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030378 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:07.030396 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030400 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:07.030513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030379 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:07.030513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030430 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:07.030513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030441 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:07.030513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.030410 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:07.031675 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.031660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.031785 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.031701 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:07.032302 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.032287 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:07.032374 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.032313 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:07.032374 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.032326 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:07.047208 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.047188 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-9.ec2.internal\" not found" node="ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.051106 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.051087 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-9.ec2.internal\" not found" node="ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.053562 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.053547 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found"
Apr 20 14:55:07.054599 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.054584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.054644 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.054613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.054644 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.054633 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e193ebdeb87d4b9c6bb9f329d9d23d3d-config\") pod \"kube-apiserver-proxy-ip-10-0-141-9.ec2.internal\" (UID: \"e193ebdeb87d4b9c6bb9f329d9d23d3d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.154627 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.154600 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found"
Apr 20 14:55:07.155770 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.155829 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.155829 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155813 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e193ebdeb87d4b9c6bb9f329d9d23d3d-config\") pod \"kube-apiserver-proxy-ip-10-0-141-9.ec2.internal\" (UID: \"e193ebdeb87d4b9c6bb9f329d9d23d3d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.155890 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155839 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e193ebdeb87d4b9c6bb9f329d9d23d3d-config\") pod \"kube-apiserver-proxy-ip-10-0-141-9.ec2.internal\" (UID: \"e193ebdeb87d4b9c6bb9f329d9d23d3d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal"
Apr 20 14:55:07.155890 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") "
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" Apr 20 14:55:07.155890 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.155848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e40ef7a2f6c320642742d71b7138ad0e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal\" (UID: \"e40ef7a2f6c320642742d71b7138ad0e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" Apr 20 14:55:07.255105 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.255076 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.348666 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.348597 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" Apr 20 14:55:07.353337 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.353315 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal" Apr 20 14:55:07.355429 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.355413 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.456142 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.456099 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.556594 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.556558 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.657139 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.657058 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.754565 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.754532 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:55:07.755409 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.754673 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:55:07.757649 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.757628 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.851345 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.851315 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:55:07.858452 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.858429 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:07.862285 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.862255 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:50:06 +0000 UTC" deadline="2027-09-20 21:30:05.447679665 +0000 UTC" Apr 20 14:55:07.862285 ip-10-0-141-9 
kubenswrapper[2571]: I0420 14:55:07.862283 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12438h34m57.585399553s" Apr 20 14:55:07.862532 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.862520 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:55:07.885882 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.885858 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5npdc" Apr 20 14:55:07.890034 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.890016 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:07.890825 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:07.890803 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode40ef7a2f6c320642742d71b7138ad0e.slice/crio-6db398143b5308d158d37cd78df0fca4d0a8c0b0c2d2aae279fc2cdae511b7bc WatchSource:0}: Error finding container 6db398143b5308d158d37cd78df0fca4d0a8c0b0c2d2aae279fc2cdae511b7bc: Status 404 returned error can't find the container with id 6db398143b5308d158d37cd78df0fca4d0a8c0b0c2d2aae279fc2cdae511b7bc Apr 20 14:55:07.891129 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:07.891114 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode193ebdeb87d4b9c6bb9f329d9d23d3d.slice/crio-069f9319358e4ab22ed899f5b9976dea1b15cbcf095a515696bfb6349cb23ce2 WatchSource:0}: Error finding container 069f9319358e4ab22ed899f5b9976dea1b15cbcf095a515696bfb6349cb23ce2: Status 404 returned error can't find the container with id 069f9319358e4ab22ed899f5b9976dea1b15cbcf095a515696bfb6349cb23ce2 Apr 20 14:55:07.893928 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.893909 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5npdc" Apr 20 14:55:07.894665 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.894650 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:55:07.930481 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.930381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal" event={"ID":"e193ebdeb87d4b9c6bb9f329d9d23d3d","Type":"ContainerStarted","Data":"069f9319358e4ab22ed899f5b9976dea1b15cbcf095a515696bfb6349cb23ce2"} Apr 20 14:55:07.931839 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:07.931808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" event={"ID":"e40ef7a2f6c320642742d71b7138ad0e","Type":"ContainerStarted","Data":"6db398143b5308d158d37cd78df0fca4d0a8c0b0c2d2aae279fc2cdae511b7bc"} Apr 20 14:55:07.959211 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:07.959191 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:08.059761 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.059727 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:08.089744 ip-10-0-141-9 kubenswrapper[2571]: I0420 
14:55:08.089717 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:08.160259 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.160213 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:08.260798 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.260726 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-9.ec2.internal\" not found" Apr 20 14:55:08.350381 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.350267 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:08.352526 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.352508 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" Apr 20 14:55:08.366036 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.366006 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:55:08.367086 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.367063 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal" Apr 20 14:55:08.377754 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.377733 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:55:08.831771 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.831743 2571 apiserver.go:52] "Watching apiserver" Apr 20 14:55:08.837576 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.837552 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:55:08.839571 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.839547 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-t2gsq","openshift-network-operator/iptables-alerter-dqx88","openshift-ovn-kubernetes/ovnkube-node-qpw9h","kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s","openshift-cluster-node-tuning-operator/tuned-d7grf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal","openshift-multus/network-metrics-daemon-7v79q","openshift-network-diagnostics/network-check-target-x9fss","kube-system/konnectivity-agent-v5dst","openshift-dns/node-resolver-58fm4","openshift-image-registry/node-ca-gk5rl","openshift-multus/multus-additional-cni-plugins-kfbrh"] Apr 20 14:55:08.842593 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.842572 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.842918 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.842869 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.844004 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.843982 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.845022 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.844998 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:55:08.845119 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845035 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4fsdr\"" Apr 20 14:55:08.845176 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845154 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.845256 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.845619 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845599 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.845619 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845615 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.845866 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845647 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:55:08.845866 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.845599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6zlnz\"" Apr 20 14:55:08.846394 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846374 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:55:08.846479 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846430 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:55:08.846479 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846467 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:55:08.846638 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846616 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.846638 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846625 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:55:08.847213 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846708 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.847213 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.846737 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.847315 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.847237 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.847315 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.847238 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whf9x\"" Apr 20 14:55:08.848187 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.848169 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.848808 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.848706 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.848808 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.848726 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.848996 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.848977 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:55:08.849100 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849002 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:55:08.849241 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849220 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mgzjq\"" Apr 20 14:55:08.849241 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.849389 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849275 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:55:08.849536 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849517 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jc2sv\"" Apr 20 14:55:08.849631 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.849601 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.850201 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.850003 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.850335 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.850302 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.850335 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.850318 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.850462 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.850336 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ljhlf\"" Apr 20 14:55:08.852128 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.852108 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:08.852420 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.852209 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:08.852420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.852348 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:55:08.852420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.852367 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bwj5r\"" Apr 20 14:55:08.852420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.852378 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:55:08.853669 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.853633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:08.854304 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.854112 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:08.858884 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.858864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:08.858976 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.858879 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.861235 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861215 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lb9tw\"" Apr 20 14:55:08.861347 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861303 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:55:08.861347 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861326 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:55:08.861451 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861310 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:55:08.861500 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861484 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mx55k\"" Apr 20 14:55:08.861570 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.861555 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:55:08.864637 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.864733 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-bin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.864733 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw2n4\" (UniqueName: \"kubernetes.io/projected/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-kube-api-access-cw2n4\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.864733 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864718 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-device-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.864883 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99wnz\" (UniqueName: \"kubernetes.io/projected/2e10015e-64f6-4b90-b27b-5d53c810c05d-kube-api-access-99wnz\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.864883 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864766 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-slash\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.864883 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-ovn\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.864883 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-log-socket\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.864883 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-host\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864918 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864935 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-multus\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-env-overrides\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864967 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-registration-dir\") pod 
\"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.864989 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-netns\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0827ffc7-2165-4812-b9cf-29976d74ffc2-konnectivity-ca\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-netns\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865057 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.865085 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-systemd\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-run\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865151 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-sys-fs\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 
14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865175 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-hostroot\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-conf-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysconfig\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-tuned\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-kubelet\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw6wl\" (UniqueName: \"kubernetes.io/projected/026bd687-3320-46f1-b7ea-f615e5b5a821-kube-api-access-rw6wl\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865405 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0827ffc7-2165-4812-b9cf-29976d74ffc2-agent-certs\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-var-lib-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-modprobe-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-kubernetes\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.865508 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865501 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-kubelet\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-netd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865563 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-tmp\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-system-cni-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865612 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " 
pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865631 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6sxn\" (UniqueName: \"kubernetes.io/projected/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-kube-api-access-h6sxn\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865790 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-systemd-units\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-socket-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-multus-certs\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-etc-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.865976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5cw\" (UniqueName: \"kubernetes.io/projected/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-kube-api-access-2l5cw\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866258 
ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866011 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-os-release\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvc6\" (UniqueName: \"kubernetes.io/projected/d1dafe36-2ae8-4593-82df-fbff4eee87b1-kube-api-access-8lvc6\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:08.866258 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866075 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-serviceca\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-etc-kubernetes\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866123 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-node-log\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866143 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-bin\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-var-lib-kubelet\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6sf2\" 
(UniqueName: \"kubernetes.io/projected/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-kube-api-access-s6sf2\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-host-slash\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-config\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866308 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovn-node-metrics-cert\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-conf\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-sys\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866400 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-cnibin\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-cnibin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-cni-binary-copy\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.866887 
ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-host\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-socket-dir-parent\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.866887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-k8s-cni-cncf-io\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-script-lib\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-iptables-alerter-script\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-daemon-config\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzqpn\" (UniqueName: \"kubernetes.io/projected/41e84f20-505c-41ef-8790-7da38a92ada4-kube-api-access-pzqpn\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866748 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-lib-modules\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866773 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866809 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-system-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866832 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-os-release\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866854 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-systemd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.867520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.866869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.894931 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.894908 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:07 +0000 UTC" deadline="2027-10-21 23:48:29.239749309 +0000 UTC" Apr 20 14:55:08.894931 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.894930 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13184h53m20.34482133s" Apr 20 14:55:08.954158 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.954125 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 14:55:08.967813 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-node-log\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967825 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967845 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-bin\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-var-lib-kubelet\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6sf2\" (UniqueName: \"kubernetes.io/projected/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-kube-api-access-s6sf2\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967900 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-host-slash\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-node-log\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-bin\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.967958 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-config\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovn-node-metrics-cert\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.968393 
ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-var-lib-kubelet\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-conf\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.967977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-host-slash\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-sys\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968054 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-cnibin\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-cnibin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968105 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-cni-binary-copy\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968122 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-sys\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968131 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-host\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968136 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-cnibin\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-conf\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968168 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-host\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-socket-dir-parent\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-cnibin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-k8s-cni-cncf-io\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968209 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-socket-dir-parent\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.968393 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968240 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-script-lib\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968253 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-k8s-cni-cncf-io\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968267 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-iptables-alerter-script\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968293 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-daemon-config\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzqpn\" (UniqueName: \"kubernetes.io/projected/41e84f20-505c-41ef-8790-7da38a92ada4-kube-api-access-pzqpn\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-lib-modules\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968418 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968460 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-lib-modules\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-config\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-system-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968777 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-os-release\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovnkube-script-lib\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-cni-binary-copy\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968804 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-systemd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-system-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-os-release\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969199 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968873 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-daemon-config\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968903 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-systemd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-iptables-alerter-script\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-bin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-bin\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-cni-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968960 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cw2n4\" (UniqueName: \"kubernetes.io/projected/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-kube-api-access-cw2n4\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.968982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-device-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99wnz\" (UniqueName: \"kubernetes.io/projected/2e10015e-64f6-4b90-b27b-5d53c810c05d-kube-api-access-99wnz\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 
14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969027 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-slash\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969040 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-device-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-ovn\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969072 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-log-socket\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-host\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969102 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-slash\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.969968 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-multus\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " 
pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-ovn\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-log-socket\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-env-overrides\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969216 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-cni-multus\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969217 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-registration-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969277 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-etc-selinux\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b535020a-3ebe-44bb-8180-63bb281aceff-kube-api-access-xwprn\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-netns\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969542 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0827ffc7-2165-4812-b9cf-29976d74ffc2-konnectivity-ca\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-netns\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-registration-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:08.970787 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-systemd\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-run\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969671 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b535020a-3ebe-44bb-8180-63bb281aceff-tmp-dir\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969707 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-sys-fs\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969726 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-hostroot\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-conf-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysconfig\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-tuned\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-kubelet\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969830 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw6wl\" (UniqueName: \"kubernetes.io/projected/026bd687-3320-46f1-b7ea-f615e5b5a821-kube-api-access-rw6wl\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0827ffc7-2165-4812-b9cf-29976d74ffc2-agent-certs\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969860 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-var-lib-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969884 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-modprobe-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969903 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-kubernetes\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969954 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-kubelet\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-hostroot\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.971527 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-netd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-multus-conf-dir\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-tmp\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970062 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-system-cni-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-netns\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-run-netns\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-env-overrides\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-run-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6sxn\" (UniqueName: \"kubernetes.io/projected/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-kube-api-access-h6sxn\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970169 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-cni-netd\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 
14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-run\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.969326 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-host\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970188 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-systemd-units\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970224 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-socket-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-multus-certs\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.972388 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-etc-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5cw\" (UniqueName: \"kubernetes.io/projected/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-kube-api-access-2l5cw\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-os-release\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970390 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvc6\" (UniqueName: \"kubernetes.io/projected/d1dafe36-2ae8-4593-82df-fbff4eee87b1-kube-api-access-8lvc6\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970417 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b535020a-3ebe-44bb-8180-63bb281aceff-hosts-file\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-serviceca\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970448 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-systemd\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-etc-kubernetes\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-etc-kubernetes\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysconfig\") pod \"tuned-d7grf\" (UID: 
\"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-binary-copy\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-socket-dir\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-systemd-units\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-modprobe-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973150 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-os-release\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-var-lib-kubelet\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/026bd687-3320-46f1-b7ea-f615e5b5a821-host-run-multus-certs\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0827ffc7-2165-4812-b9cf-29976d74ffc2-konnectivity-ca\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-var-lib-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970899 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-etc-openvswitch\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-host-kubelet\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-serviceca\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970978 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-kubernetes\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-sysctl-d\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.970032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2e10015e-64f6-4b90-b27b-5d53c810c05d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.971037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/41e84f20-505c-41ef-8790-7da38a92ada4-sys-fs\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.971035 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e10015e-64f6-4b90-b27b-5d53c810c05d-system-cni-dir\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.969414 2571 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.971274 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:09.471234981 +0000 UTC m=+3.065054027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.972285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-ovn-node-metrics-cert\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.972968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-tmp\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.973922 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.973828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0827ffc7-2165-4812-b9cf-29976d74ffc2-agent-certs\") pod \"konnectivity-agent-v5dst\" (UID: \"0827ffc7-2165-4812-b9cf-29976d74ffc2\") " pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:08.974530 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.973979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-etc-tuned\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.976630 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.976607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6sf2\" (UniqueName: \"kubernetes.io/projected/7be4d4a0-b5c2-4857-a3ae-245ad4430c7c-kube-api-access-s6sf2\") pod \"node-ca-gk5rl\" (UID: \"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c\") " pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:08.977172 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.977147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzqpn\" (UniqueName: \"kubernetes.io/projected/41e84f20-505c-41ef-8790-7da38a92ada4-kube-api-access-pzqpn\") pod \"aws-ebs-csi-driver-node-vs87s\" (UID: \"41e84f20-505c-41ef-8790-7da38a92ada4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:08.977299 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.977171 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:08.977299 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.977190 2571 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:08.977299 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.977203 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:08.977299 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:08.977293 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:09.477252855 +0000 UTC m=+3.071071896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:08.977506 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.977438 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99wnz\" (UniqueName: \"kubernetes.io/projected/2e10015e-64f6-4b90-b27b-5d53c810c05d-kube-api-access-99wnz\") pod \"multus-additional-cni-plugins-kfbrh\" (UID: \"2e10015e-64f6-4b90-b27b-5d53c810c05d\") " pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:08.977506 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.977448 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw2n4\" (UniqueName: \"kubernetes.io/projected/febe8a99-bbc0-4ad0-9eb4-512c729e11c3-kube-api-access-cw2n4\") pod \"ovnkube-node-qpw9h\" (UID: \"febe8a99-bbc0-4ad0-9eb4-512c729e11c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:08.979762 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.979742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5cw\" (UniqueName: \"kubernetes.io/projected/aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d-kube-api-access-2l5cw\") pod \"tuned-d7grf\" (UID: \"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d\") " pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:08.980046 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.980019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6sxn\" (UniqueName: \"kubernetes.io/projected/2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af-kube-api-access-h6sxn\") pod \"iptables-alerter-dqx88\" (UID: \"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af\") " pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:08.980154 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.980094 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw6wl\" (UniqueName: \"kubernetes.io/projected/026bd687-3320-46f1-b7ea-f615e5b5a821-kube-api-access-rw6wl\") pod \"multus-t2gsq\" (UID: \"026bd687-3320-46f1-b7ea-f615e5b5a821\") " pod="openshift-multus/multus-t2gsq" Apr 20 14:55:08.980340 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:08.980318 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8lvc6\" (UniqueName: \"kubernetes.io/projected/d1dafe36-2ae8-4593-82df-fbff4eee87b1-kube-api-access-8lvc6\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:09.070867 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.070829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b535020a-3ebe-44bb-8180-63bb281aceff-kube-api-access-xwprn\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.071044 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.070887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b535020a-3ebe-44bb-8180-63bb281aceff-tmp-dir\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.071044 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.070932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b535020a-3ebe-44bb-8180-63bb281aceff-hosts-file\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.071044 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.071032 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b535020a-3ebe-44bb-8180-63bb281aceff-hosts-file\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.071289 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.071270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b535020a-3ebe-44bb-8180-63bb281aceff-tmp-dir\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.081155 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.081125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwprn\" (UniqueName: \"kubernetes.io/projected/b535020a-3ebe-44bb-8180-63bb281aceff-kube-api-access-xwprn\") pod \"node-resolver-58fm4\" (UID: \"b535020a-3ebe-44bb-8180-63bb281aceff\") " pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.156116 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.156035 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gk5rl" Apr 20 14:55:09.165808 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.165776 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dqx88" Apr 20 14:55:09.172852 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.172828 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:09.173069 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.173041 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:55:09.179523 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.179504 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-t2gsq" Apr 20 14:55:09.186145 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.186119 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" Apr 20 14:55:09.190717 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.190699 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-d7grf" Apr 20 14:55:09.197266 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.197236 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" Apr 20 14:55:09.203810 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.203780 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-58fm4" Apr 20 14:55:09.209318 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.209295 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v5dst" Apr 20 14:55:09.473242 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.473146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:09.473401 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.473273 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:09.473401 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.473357 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:10.473334823 +0000 UTC m=+4.067153868 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:09.573595 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.573539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:09.573763 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.573699 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:09.573763 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.573726 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:09.573763 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.573740 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:09.573889 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:09.573827 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:10.573790732 +0000 UTC m=+4.167609774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:09.667973 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.667842 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa0c5aaf_1c99_4a2e_b7e7_eebc3c8ba45d.slice/crio-a2d4d2e8dcc98065101c8e746b366247a5939babc7d5eb4424acca22a6bc3078 WatchSource:0}: Error finding container a2d4d2e8dcc98065101c8e746b366247a5939babc7d5eb4424acca22a6bc3078: Status 404 returned error can't find the container with id a2d4d2e8dcc98065101c8e746b366247a5939babc7d5eb4424acca22a6bc3078 Apr 20 14:55:09.670964 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.670864 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be4d4a0_b5c2_4857_a3ae_245ad4430c7c.slice/crio-bb0344fce2a6a8b726e21cf3bf3fdc5765271a6ae301c88ddd1e22f65e0bc341 WatchSource:0}: Error finding container bb0344fce2a6a8b726e21cf3bf3fdc5765271a6ae301c88ddd1e22f65e0bc341: Status 404 returned error can't find the container with id bb0344fce2a6a8b726e21cf3bf3fdc5765271a6ae301c88ddd1e22f65e0bc341 Apr 20 14:55:09.673415 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.673245 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e84f20_505c_41ef_8790_7da38a92ada4.slice/crio-708310ca0dfe086419122bcecd604a6727a653961245919a18adea7b410bc456 WatchSource:0}: Error finding container 708310ca0dfe086419122bcecd604a6727a653961245919a18adea7b410bc456: Status 404 returned error can't find the container with id 708310ca0dfe086419122bcecd604a6727a653961245919a18adea7b410bc456 Apr 20 14:55:09.673657 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.673636 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9613e1_9a91_40ed_9ec1_4fcfa4ec06af.slice/crio-ae6796be009d6c90c2f2ba9aa54f4935f17ddb394e31a588f839b7c61e03b9c7 WatchSource:0}: Error finding container ae6796be009d6c90c2f2ba9aa54f4935f17ddb394e31a588f839b7c61e03b9c7: Status 404 returned error can't find the container with id ae6796be009d6c90c2f2ba9aa54f4935f17ddb394e31a588f839b7c61e03b9c7 Apr 20 14:55:09.675262 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.675201 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb535020a_3ebe_44bb_8180_63bb281aceff.slice/crio-d2957406a41d910b5aca2d4323d67f509e0d24b59fd6abd0b8852152178ad6e4 WatchSource:0}: Error finding container d2957406a41d910b5aca2d4323d67f509e0d24b59fd6abd0b8852152178ad6e4: Status 404 returned error can't find the container with id d2957406a41d910b5aca2d4323d67f509e0d24b59fd6abd0b8852152178ad6e4 Apr 20 14:55:09.681643 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:09.681620 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e10015e_64f6_4b90_b27b_5d53c810c05d.slice/crio-f406080ec31e46d992e5d523b590a6e5f4ee6f8e52dfaf40f12d2dd1684b9f92 WatchSource:0}: Error finding container 
f406080ec31e46d992e5d523b590a6e5f4ee6f8e52dfaf40f12d2dd1684b9f92: Status 404 returned error can't find the container with id f406080ec31e46d992e5d523b590a6e5f4ee6f8e52dfaf40f12d2dd1684b9f92 Apr 20 14:55:09.895635 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.895595 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:07 +0000 UTC" deadline="2027-12-04 21:54:15.568049062 +0000 UTC" Apr 20 14:55:09.895635 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.895629 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14238h59m5.672423516s" Apr 20 14:55:09.936227 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.936192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"00f4eb433bf6494b8df111f80459ea1bf71d8cc2252f409b64c3be5166a68822"} Apr 20 14:55:09.937226 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.937193 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerStarted","Data":"f406080ec31e46d992e5d523b590a6e5f4ee6f8e52dfaf40f12d2dd1684b9f92"} Apr 20 14:55:09.938220 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.938198 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v5dst" event={"ID":"0827ffc7-2165-4812-b9cf-29976d74ffc2","Type":"ContainerStarted","Data":"d80ca55a0833ec8c22d5532953b8cefcbf76ecbaa771acf874096bc456350ce6"} Apr 20 14:55:09.939019 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.938995 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58fm4" event={"ID":"b535020a-3ebe-44bb-8180-63bb281aceff","Type":"ContainerStarted","Data":"d2957406a41d910b5aca2d4323d67f509e0d24b59fd6abd0b8852152178ad6e4"} Apr 20 14:55:09.939964 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.939935 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dqx88" event={"ID":"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af","Type":"ContainerStarted","Data":"ae6796be009d6c90c2f2ba9aa54f4935f17ddb394e31a588f839b7c61e03b9c7"} Apr 20 14:55:09.940942 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.940924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" event={"ID":"41e84f20-505c-41ef-8790-7da38a92ada4","Type":"ContainerStarted","Data":"708310ca0dfe086419122bcecd604a6727a653961245919a18adea7b410bc456"} Apr 20 14:55:09.942248 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.942210 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gk5rl" event={"ID":"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c","Type":"ContainerStarted","Data":"bb0344fce2a6a8b726e21cf3bf3fdc5765271a6ae301c88ddd1e22f65e0bc341"} Apr 20 14:55:09.943379 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.943334 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d7grf" event={"ID":"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d","Type":"ContainerStarted","Data":"a2d4d2e8dcc98065101c8e746b366247a5939babc7d5eb4424acca22a6bc3078"} Apr 20 14:55:09.944890 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.944871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal" event={"ID":"e193ebdeb87d4b9c6bb9f329d9d23d3d","Type":"ContainerStarted","Data":"df73b5f6082fb33c9475e49b956c1e0bdba243bcff852393dc455f7df71f370e"} Apr 20 14:55:09.945818 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.945799 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2gsq" event={"ID":"026bd687-3320-46f1-b7ea-f615e5b5a821","Type":"ContainerStarted","Data":"358e5e857adc1c3ed80cb86936dc1bd67fefb6566ca16d8be0c3373eaab17fc5"} Apr 20 14:55:09.962314 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:09.962277 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-9.ec2.internal" podStartSLOduration=1.9622654480000001 podStartE2EDuration="1.962265448s" podCreationTimestamp="2026-04-20 14:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:09.96222759 +0000 UTC m=+3.556046645" watchObservedRunningTime="2026-04-20 14:55:09.962265448 +0000 UTC m=+3.556084505" Apr 20 14:55:10.481522 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.480899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:10.481522 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.481084 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:10.481522 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.481146 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:12.481128281 +0000 UTC m=+6.074947329 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:10.582586 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.582495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:10.582757 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.582722 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:10.582757 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.582745 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:10.582757 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.582757 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:10.582913 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.582818 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:12.582798865 +0000 UTC m=+6.176617911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:10.932441 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.932132 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:10.932441 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.932267 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:10.932441 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.932132 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:10.932441 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:10.932367 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:10.953364 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.953328 2571 generic.go:358] "Generic (PLEG): container finished" podID="e40ef7a2f6c320642742d71b7138ad0e" containerID="bf911a2b96f5d1dfdfe231403501baec011c171996889b4f2224f93ebcf61512" exitCode=0 Apr 20 14:55:10.954233 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:10.954207 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" event={"ID":"e40ef7a2f6c320642742d71b7138ad0e","Type":"ContainerDied","Data":"bf911a2b96f5d1dfdfe231403501baec011c171996889b4f2224f93ebcf61512"} Apr 20 14:55:11.961865 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:11.960116 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" event={"ID":"e40ef7a2f6c320642742d71b7138ad0e","Type":"ContainerStarted","Data":"55a12da715947236ad582cf7d77919d88fe68b7aa569066e2082175987ed81dc"} Apr 20 14:55:11.974878 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:11.974830 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-9.ec2.internal" podStartSLOduration=3.974788539 podStartE2EDuration="3.974788539s" podCreationTimestamp="2026-04-20 14:55:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:11.974402296 +0000 UTC m=+5.568221361" watchObservedRunningTime="2026-04-20 14:55:11.974788539 +0000 UTC m=+5.568607604" Apr 20 14:55:12.503322 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:12.503198 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:12.503490 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.503383 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:12.503490 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.503450 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:16.503433354 +0000 UTC m=+10.097252402 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:12.603814 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:12.603772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:12.604019 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.603930 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:12.604019 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.603949 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:12.604019 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.603961 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:12.604019 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.604019 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:16.604001594 +0000 UTC m=+10.197820653 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:12.927657 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:12.927574 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:12.927869 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.927728 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:12.928213 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:12.928195 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:12.928344 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:12.928309 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:14.928100 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:14.928063 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:14.928619 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:14.928229 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:14.928619 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:14.928293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:14.928619 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:14.928380 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:16.534765 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:16.534148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:16.534765 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.534341 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:16.534765 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.534409 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:24.534390094 +0000 UTC m=+18.128209150 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:16.636040 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:16.635377 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:16.636040 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.635562 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:16.636040 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.635581 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:16.636040 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.635594 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:16.636040 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.635651 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:24.635633812 +0000 UTC m=+18.229452856 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:16.932277 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:16.932201 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:16.932277 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:16.932219 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:16.932492 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.932359 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:16.932561 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:16.932481 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:17.130517 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.130484 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gd648"] Apr 20 14:55:17.133066 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.133040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.133206 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:17.133126 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9" Apr 20 14:55:17.241021 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.240929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-dbus\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.241021 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.240979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-kubelet-config\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.241198 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.241058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.341434 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.341382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.341612 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.341473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-dbus\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" 
Apr 20 14:55:17.341612 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:17.341507 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:17.341612 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:17.341582 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:17.841562822 +0000 UTC m=+11.435381869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:17.341612 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.341579 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-kubelet-config\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.341612 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.341510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-kubelet-config\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.341908 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.341742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/78ea8d8a-7389-4822-92b4-41f9e8b474b9-dbus\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.847058 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:17.847024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:17.847530 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:17.847135 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:17.847530 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:17.847185 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:18.847172384 +0000 UTC m=+12.440991425 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:18.854278 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:18.854176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:18.854806 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:18.854347 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:18.854806 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:18.854425 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:20.854404932 +0000 UTC m=+14.448223988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:18.931307 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:18.931211 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:18.931307 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:18.931222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:18.931560 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:18.931222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:18.931560 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:18.931426 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:18.931560 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:18.931526 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1" Apr 20 14:55:18.931731 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:18.931600 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9" Apr 20 14:55:20.869775 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:20.869738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:20.870244 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:20.869912 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:20.870244 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:20.869992 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:24.869972825 +0000 UTC m=+18.463791879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:20.928338 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:20.928305 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:20.928497 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:20.928311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:20.928497 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:20.928432 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9" Apr 20 14:55:20.928653 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:20.928532 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7" Apr 20 14:55:20.928653 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:20.928321 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:20.928653 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:20.928635 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:22.927957 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:22.927920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:22.927957 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:22.927938 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:22.927957 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:22.927930 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:22.928487 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:22.928053 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:22.928487 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:22.928113 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:22.928487 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:22.928163 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:24.595792 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.595749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:24.596293 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.595897 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:24.596293 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.595971 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:40.595950325 +0000 UTC m=+34.189769366 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:24.696995 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.696951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:24.697168 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.697118 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:55:24.697168 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.697142 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:55:24.697168 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.697155 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:24.697291 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.697208 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:40.697194389 +0000 UTC m=+34.291013430 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:24.898390 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.898300 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:24.898542 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.898467 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:55:24.898613 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.898544 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:32.898525746 +0000 UTC m=+26.492344792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:55:24.928001 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.927965 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:24.928177 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.927965 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:24.928177 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.928095 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:24.928261 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.928200 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:24.928261 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:24.927965 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:24.928329 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:24.928295 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:26.930089 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.929439 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:26.930089 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:26.929779 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:26.930839 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.930432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:26.930839 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:26.930523 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:26.930839 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.930552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:26.930839 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:26.930647 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:26.990261 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.990216 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v5dst" event={"ID":"0827ffc7-2165-4812-b9cf-29976d74ffc2","Type":"ContainerStarted","Data":"ac98b5a9aea10f339997210e8fc415e2a4793a3df5cb0845a9b1ba85c0401e9a"}
Apr 20 14:55:26.991887 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.991849 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" event={"ID":"41e84f20-505c-41ef-8790-7da38a92ada4","Type":"ContainerStarted","Data":"61fefced0974952d3572a8a288083e959c69a6cc0e28d7766a0e96b3be7b9430"}
Apr 20 14:55:26.994011 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.993883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gk5rl" event={"ID":"7be4d4a0-b5c2-4857-a3ae-245ad4430c7c","Type":"ContainerStarted","Data":"fdefc5799e55bbf6ea2e31166dfd8434fd15d5253b609d2931fbda8ccc09d519"}
Apr 20 14:55:26.995749 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:26.995473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-d7grf" event={"ID":"aa0c5aaf-1c99-4a2e-b7e7-eebc3c8ba45d","Type":"ContainerStarted","Data":"ad2cd42ab480b80897c9438a7eeed0b56c3abbea4376653253c4fdbecdc8e062"}
Apr 20 14:55:27.003977 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.003938 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v5dst" podStartSLOduration=3.095260763 podStartE2EDuration="20.003927922s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.678426932 +0000 UTC m=+3.272245980" lastFinishedPulling="2026-04-20 14:55:26.587094098 +0000 UTC m=+20.180913139" observedRunningTime="2026-04-20 14:55:27.003716067 +0000 UTC m=+20.597535132" watchObservedRunningTime="2026-04-20 14:55:27.003927922 +0000 UTC m=+20.597747015"
Apr 20 14:55:27.031336 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.031299 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gk5rl" podStartSLOduration=3.123015496 podStartE2EDuration="20.031287615s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.672958615 +0000 UTC m=+3.266777666" lastFinishedPulling="2026-04-20 14:55:26.58123074 +0000 UTC m=+20.175049785" observedRunningTime="2026-04-20 14:55:27.017413515 +0000 UTC m=+20.611232578" watchObservedRunningTime="2026-04-20 14:55:27.031287615 +0000 UTC m=+20.625106674"
Apr 20 14:55:27.031793 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.031766 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-d7grf" podStartSLOduration=3.120266259 podStartE2EDuration="20.031757679s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.66980183 +0000 UTC m=+3.263620871" lastFinishedPulling="2026-04-20 14:55:26.581293235 +0000 UTC m=+20.175112291" observedRunningTime="2026-04-20 14:55:27.031480077 +0000 UTC m=+20.625299141" watchObservedRunningTime="2026-04-20 14:55:27.031757679 +0000 UTC m=+20.625576741"
Apr 20 14:55:27.998184 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.998151 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t2gsq" event={"ID":"026bd687-3320-46f1-b7ea-f615e5b5a821","Type":"ContainerStarted","Data":"cc9817a8a0cb18c2c435db10ba696f8089023c335dc8b79b2855765b3a46a60a"}
Apr 20 14:55:27.999920 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.999898 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"a367b51c0f5aee342ae3ee5dc586ca20534c0d020fbf0bb317186f1fc5ecfaca"}
Apr 20 14:55:28.000013 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.999925 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"e33e93023c2aee5528cc87d0750620304300a0a94f048983934c55090e244072"}
Apr 20 14:55:28.000013 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:27.999934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"83a02707d8b5ff58ee7bffbd7a27e40f40ef5649434cc97e4bebdb1b6f1da827"}
Apr 20 14:55:28.001244 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.001215 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="601ad8a01f6b5768cfd711cabccc5c8a0065763f49fd364289d8614295ce61cc" exitCode=0
Apr 20 14:55:28.001333 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.001287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"601ad8a01f6b5768cfd711cabccc5c8a0065763f49fd364289d8614295ce61cc"}
Apr 20 14:55:28.002474 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.002447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-58fm4" event={"ID":"b535020a-3ebe-44bb-8180-63bb281aceff","Type":"ContainerStarted","Data":"7d0410f049ae58e42e0eae80b349ac35fffcda1f40eee4c1960ffb799c3b34a7"}
Apr 20 14:55:28.003699 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.003663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dqx88" event={"ID":"2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af","Type":"ContainerStarted","Data":"82507686833cf09afd5423d8048eaa8c56e5465de8da8d94849a14ec284c1420"}
Apr 20 14:55:28.015489 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.015451 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t2gsq" podStartSLOduration=3.8355787120000002 podStartE2EDuration="21.01543864s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.683610779 +0000 UTC m=+3.277429823" lastFinishedPulling="2026-04-20 14:55:26.863470696 +0000 UTC m=+20.457289751" observedRunningTime="2026-04-20 14:55:28.015353934 +0000 UTC m=+21.609173036" watchObservedRunningTime="2026-04-20 14:55:28.01543864 +0000 UTC m=+21.609257703"
Apr 20 14:55:28.047489 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.047442 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-58fm4" podStartSLOduration=4.126561327 podStartE2EDuration="21.047429791s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.678542986 +0000 UTC m=+3.272362040" lastFinishedPulling="2026-04-20 14:55:26.599411458 +0000 UTC m=+20.193230504" observedRunningTime="2026-04-20 14:55:28.047326014 +0000 UTC m=+21.641145077" watchObservedRunningTime="2026-04-20 14:55:28.047429791 +0000 UTC m=+21.641248855"
Apr 20 14:55:28.065653 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.065613 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dqx88" podStartSLOduration=4.1469189459999996 podStartE2EDuration="21.065601245s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.675508204 +0000 UTC m=+3.269327250" lastFinishedPulling="2026-04-20 14:55:26.594190502 +0000 UTC m=+20.188009549" observedRunningTime="2026-04-20 14:55:28.065189498 +0000 UTC m=+21.659008564" watchObservedRunningTime="2026-04-20 14:55:28.065601245 +0000 UTC m=+21.659420332"
Apr 20 14:55:28.373621 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.373480 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 14:55:28.928270 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.928164 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:55:28.37361889Z","UUID":"adbcb22b-7672-4e06-b3ce-43c7f656231c","Handler":null,"Name":"","Endpoint":""}
Apr 20 14:55:28.928270 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.928259 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:28.928720 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.928306 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:28.928720 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:28.928381 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:28.928720 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.928452 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:28.928720 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:28.928536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:28.928720 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:28.928624 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:28.931348 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.931327 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 14:55:28.931488 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:28.931357 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 14:55:29.008039 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:29.008001 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" event={"ID":"41e84f20-505c-41ef-8790-7da38a92ada4","Type":"ContainerStarted","Data":"f068d8522769a349ac6418e77dc3029034446f82b94259e0069ee88cc4fccf55"}
Apr 20 14:55:29.011424 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:29.011392 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"7ddef17ba3588ad86a7a6066993913d84ded17bb1e1bd11030409cad35f148b3"}
Apr 20 14:55:29.011424 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:29.011431 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"8d0732870642abd61ec2e56e766b29093da09171ade104ebac29cae29d9aaf2c"}
Apr 20 14:55:29.011635 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:29.011445 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"4097672429d8058c14be4036644f462fde600e1398c90fd035f7df0602a67884"}
Apr 20 14:55:30.015187 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.015152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" event={"ID":"41e84f20-505c-41ef-8790-7da38a92ada4","Type":"ContainerStarted","Data":"4b9583df1b4e0ee59aa39a278172714e3897f9759ca7f4505bf206211cda76c8"}
Apr 20 14:55:30.031627 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.031574 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vs87s" podStartSLOduration=3.260970048 podStartE2EDuration="23.031557537s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.675861666 +0000 UTC m=+3.269680738" lastFinishedPulling="2026-04-20 14:55:29.446449175 +0000 UTC m=+23.040268227" observedRunningTime="2026-04-20 14:55:30.031118494 +0000 UTC m=+23.624937559" watchObservedRunningTime="2026-04-20 14:55:30.031557537 +0000 UTC m=+23.625376602"
Apr 20 14:55:30.236023 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.235992 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v5dst"
Apr 20 14:55:30.236745 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.236721 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v5dst"
Apr 20 14:55:30.927614 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.927575 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:30.927823 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.927576 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:30.927823 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:30.927740 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:30.927823 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:30.927575 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:30.927823 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:30.927792 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:30.928052 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:30.927907 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:31.020916 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:31.020879 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"37399e80137cf12bf6534c37da0ba852e18c97ba9dad2e1da34c21825e3838e2"}
Apr 20 14:55:31.021586 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:31.021305 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v5dst"
Apr 20 14:55:31.021988 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:31.021968 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v5dst"
Apr 20 14:55:32.928544 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:32.928313 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:32.929003 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:32.928311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:32.929003 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:32.928582 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:32.929003 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:32.928313 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:32.929003 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:32.928659 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:32.929003 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:32.928733 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:32.959253 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:32.959228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:32.959373 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:32.959321 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:55:32.959444 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:32.959374 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret podName:78ea8d8a-7389-4822-92b4-41f9e8b474b9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:48.959360862 +0000 UTC m=+42.553179908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret") pod "global-pull-secret-syncer-gd648" (UID: "78ea8d8a-7389-4822-92b4-41f9e8b474b9") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:55:33.027396 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.027365 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" event={"ID":"febe8a99-bbc0-4ad0-9eb4-512c729e11c3","Type":"ContainerStarted","Data":"ba08e0541cc581bdbb3fa1d09cf0f44603c710687db7a1883b7ff376799cdcbd"}
Apr 20 14:55:33.027668 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.027642 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h"
Apr 20 14:55:33.027789 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.027674 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h"
Apr 20 14:55:33.029030 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.029009 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="d8bd904fdfa43f15999dcdc71acdaf8d8d0c73127d3951af6d05caa958e9a3fd" exitCode=0
Apr 20 14:55:33.029130 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.029105 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"d8bd904fdfa43f15999dcdc71acdaf8d8d0c73127d3951af6d05caa958e9a3fd"}
Apr 20 14:55:33.043280 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.043260 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h"
Apr 20 14:55:33.060437 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:33.060392 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" podStartSLOduration=8.366016711 podStartE2EDuration="26.060376218s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.680344066 +0000 UTC m=+3.274163107" lastFinishedPulling="2026-04-20 14:55:27.374703558 +0000 UTC m=+20.968522614" observedRunningTime="2026-04-20 14:55:33.057327159 +0000 UTC m=+26.651146224" watchObservedRunningTime="2026-04-20 14:55:33.060376218 +0000 UTC m=+26.654195285"
Apr 20 14:55:34.031254 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.031218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h"
Apr 20 14:55:34.045024 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.044999 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h"
Apr 20 14:55:34.542326 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.542249 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9fss"]
Apr 20 14:55:34.542464 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.542376 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:34.542508 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:34.542460 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:34.545715 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.545673 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gd648"]
Apr 20 14:55:34.545843 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.545794 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:34.545941 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:34.545884 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:34.546255 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.546233 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7v79q"]
Apr 20 14:55:34.546344 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:34.546331 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:34.546443 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:34.546420 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:35.034305 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:35.034225 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="47a46252b876ffe5788050b6cca3aedd4d17436ef43e64c751f1db538f9c7fb5" exitCode=0
Apr 20 14:55:35.034697 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:35.034309 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"47a46252b876ffe5788050b6cca3aedd4d17436ef43e64c751f1db538f9c7fb5"}
Apr 20 14:55:35.928205 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:35.928172 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:35.928205 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:35.928200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:35.928484 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:35.928281 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:35.928484 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:35.928313 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:35.928484 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:35.928379 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:35.928484 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:35.928429 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:37.039574 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:37.039298 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="f6e6fd0c45a51da12ff462583ad67771ff16b195bc479c312f0b67a6abce21de" exitCode=0
Apr 20 14:55:37.039574 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:37.039380 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"f6e6fd0c45a51da12ff462583ad67771ff16b195bc479c312f0b67a6abce21de"}
Apr 20 14:55:37.927692 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:37.927657 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:37.927930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:37.927733 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:37.927930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:37.927836 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:37.927930 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:37.927850 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:37.928142 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:37.927950 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:37.928142 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:37.928038 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:39.928444 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:39.928405 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648"
Apr 20 14:55:39.928444 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:39.928422 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:39.929148 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:39.928541 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gd648" podUID="78ea8d8a-7389-4822-92b4-41f9e8b474b9"
Apr 20 14:55:39.929148 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:39.928564 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:39.929148 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:39.928662 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7v79q" podUID="d1dafe36-2ae8-4593-82df-fbff4eee87b1"
Apr 20 14:55:39.929148 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:39.928744 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x9fss" podUID="cb271ee0-fe50-4ec5-a58b-e4cde09671b7"
Apr 20 14:55:40.622793 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.622758 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q"
Apr 20 14:55:40.622961 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.622897 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:40.623028 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.622975 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.622953879 +0000 UTC m=+66.216772933 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs") pod "network-metrics-daemon-7v79q" (UID: "d1dafe36-2ae8-4593-82df-fbff4eee87b1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:40.723589 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.723496 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:55:40.723754 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.723672 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:55:40.723754 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.723715 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:55:40.723754 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.723733 2571 projected.go:194] Error preparing data for projected volume kube-api-access-zsxkc for pod openshift-network-diagnostics/network-check-target-x9fss: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:40.723913 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:40.723796 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc podName:cb271ee0-fe50-4ec5-a58b-e4cde09671b7 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.723777416 +0000 UTC m=+66.317596459 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zsxkc" (UniqueName: "kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc") pod "network-check-target-x9fss" (UID: "cb271ee0-fe50-4ec5-a58b-e4cde09671b7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:40.780172 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.780141 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-9.ec2.internal" event="NodeReady"
Apr 20 14:55:40.780341 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.780282 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 14:55:40.815131 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.815102 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"]
Apr 20 14:55:40.848098 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.848066 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8vlv5"]
Apr 20 14:55:40.848267 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.848242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:40.851090 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.850922 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 14:55:40.851090 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.850937 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fgmdb\""
Apr 20 14:55:40.851090 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.850968 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 14:55:40.851090 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.850941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 14:55:40.866420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.866387 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 14:55:40.866983 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.866954 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ktxxq"]
Apr 20 14:55:40.867123 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.867107 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:40.869929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.869908 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tr5vz\""
Apr 20 14:55:40.870058 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.869942 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 14:55:40.870120 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.870066 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 14:55:40.891642 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.891618 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"]
Apr 20 14:55:40.891800 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.891650 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8vlv5"]
Apr 20 14:55:40.891800 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.891665 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ktxxq"]
Apr 20 14:55:40.891913 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.891798 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:55:40.894153 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.894133 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 14:55:40.894280 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.894165 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 14:55:40.894620 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.894595 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 14:55:40.894732 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:40.894630 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xg95b\""
Apr 20 14:55:41.026780 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026702 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjqk\" (UniqueName: \"kubernetes.io/projected/e0053f0d-ea66-4e0b-950d-23c42c995f23-kube-api-access-czjqk\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:55:41.026780 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026759 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vzw\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f51becfe-6707-48a0-8930-b1feea33fb21-config-volume\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026925 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f51becfe-6707-48a0-8930-b1feea33fb21-tmp-dir\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.026982 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.027001 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.027039 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.027062 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5bt\" (UniqueName: \"kubernetes.io/projected/f51becfe-6707-48a0-8930-b1feea33fb21-kube-api-access-xz5bt\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.027343 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.027156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128027 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.127987 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f51becfe-6707-48a0-8930-b1feea33fb21-config-volume\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.128027 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f51becfe-6707-48a0-8930-b1feea33fb21-tmp-dir\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128093 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128146 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5bt\" (UniqueName: \"kubernetes.io/projected/f51becfe-6707-48a0-8930-b1feea33fb21-kube-api-access-xz5bt\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128201 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czjqk\" (UniqueName: \"kubernetes.io/projected/e0053f0d-ea66-4e0b-950d-23c42c995f23-kube-api-access-czjqk\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128235 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128252 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found
Apr 20 14:55:41.128274 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128323 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:41.628304282 +0000 UTC m=+35.222123340 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128373 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128418 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:41.628406677 +0000 UTC m=+35.222225734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128456 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6vzw\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.128485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:41.628475459 +0000 UTC m=+35.222294504 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found
Apr 20 14:55:41.128791 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128502 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.129300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.128993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f51becfe-6707-48a0-8930-b1feea33fb21-tmp-dir\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.129300 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.129172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.129401 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.129351 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f51becfe-6707-48a0-8930-b1feea33fb21-config-volume\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:55:41.129795 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.129605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.133774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.133748 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.133774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.133770 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:55:41.137305 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.137189 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vzw\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") "
pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:41.137305 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.137234 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:41.137521 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.137405 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5bt\" (UniqueName: \"kubernetes.io/projected/f51becfe-6707-48a0-8930-b1feea33fb21-kube-api-access-xz5bt\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:41.137805 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.137782 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjqk\" (UniqueName: \"kubernetes.io/projected/e0053f0d-ea66-4e0b-950d-23c42c995f23-kube-api-access-czjqk\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:41.633386 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.633350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:41.633574 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.633419 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:41.633574 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.633473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:41.633574 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633524 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:41.633574 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633566 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:41.633808 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633573 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:41.633808 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633593 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found Apr 20 14:55:41.633808 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633612 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert 
podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:42.633594556 +0000 UTC m=+36.227413606 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found Apr 20 14:55:41.633808 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633644 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:42.63362449 +0000 UTC m=+36.227443554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found Apr 20 14:55:41.633808 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:41.633664 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:42.633654748 +0000 UTC m=+36.227473795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found Apr 20 14:55:41.928016 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.927940 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:55:41.928016 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.927949 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:41.928016 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.927946 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:55:41.931544 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931514 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:55:41.931544 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931531 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:55:41.931764 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931513 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:55:41.931764 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931514 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\"" Apr 20 14:55:41.931764 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931520 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:55:41.931764 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:41.931524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:55:42.642240 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:42.642207 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:42.642240 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:42.642255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:42.642309 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642388 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642393 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642468 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642484 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642470 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:44.642448368 +0000 UTC m=+38.236267430 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642586 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:44.642568953 +0000 UTC m=+38.236387997 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found Apr 20 14:55:42.642939 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:42.642608 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:44.6426008 +0000 UTC m=+38.236419840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found Apr 20 14:55:43.761738 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.761540 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld"] Apr 20 14:55:43.765037 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.765010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:43.768006 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.767980 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 14:55:43.768130 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.768011 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-g8hzx\"" Apr 20 14:55:43.768130 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.768051 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 14:55:43.768130 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.767990 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 14:55:43.768130 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.767981 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 14:55:43.773697 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.773662 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld"] Apr 20 14:55:43.798022 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.797994 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449"] Apr 20 14:55:43.804579 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.804552 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:43.807022 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.806995 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 14:55:43.810210 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.810182 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r"] Apr 20 14:55:43.817354 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.817323 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449"] Apr 20 14:55:43.817446 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.817423 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.819905 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.819873 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 14:55:43.819996 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.819958 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 14:55:43.820065 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.819996 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 14:55:43.820065 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.820001 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 14:55:43.824573 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.824548 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r"] Apr 20 14:55:43.951759 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.951943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951787 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-kube-api-access-fjfmq\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" (UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:43.951943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951814 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.951943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.951943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951856 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f295e822-750b-4a52-86c5-c07b68326988-klusterlet-config\") pod 
\"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:43.951943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f295e822-750b-4a52-86c5-c07b68326988-tmp\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:43.952105 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.951962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkg22\" (UniqueName: \"kubernetes.io/projected/74a04e11-0a0e-437a-9ca8-d8e402ebb666-kube-api-access-xkg22\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.952105 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.952030 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" (UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:43.952105 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.952078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/f295e822-750b-4a52-86c5-c07b68326988-kube-api-access-lw4wh\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:43.952190 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.952109 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:43.952190 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:43.952125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052606 ip-10-0-141-9 
kubenswrapper[2571]: I0420 14:55:44.052459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-kube-api-access-fjfmq\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" (UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:44.052606 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052606 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052606 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052512 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f295e822-750b-4a52-86c5-c07b68326988-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.052606 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f295e822-750b-4a52-86c5-c07b68326988-tmp\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.052929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkg22\" (UniqueName: \"kubernetes.io/projected/74a04e11-0a0e-437a-9ca8-d8e402ebb666-kube-api-access-xkg22\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" (UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:44.052929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/f295e822-750b-4a52-86c5-c07b68326988-kube-api-access-lw4wh\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" 
(UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.052929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.052929 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052860 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.053178 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.052971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f295e822-750b-4a52-86c5-c07b68326988-tmp\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.053391 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.053342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.056441 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056409 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-ca\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.056441 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f295e822-750b-4a52-86c5-c07b68326988-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.056617 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056412 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.056617 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056562 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" 
(UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:44.056617 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056590 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.056815 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.056666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/74a04e11-0a0e-437a-9ca8-d8e402ebb666-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.058615 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.058585 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="3e828255b85d5ffbd9c67acb01fd7ffbbb24fa474cbbf0ee00b48801f51a9484" exitCode=0 Apr 20 14:55:44.058750 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.058625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"3e828255b85d5ffbd9c67acb01fd7ffbbb24fa474cbbf0ee00b48801f51a9484"} Apr 20 14:55:44.060583 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.060552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4wh\" (UniqueName: \"kubernetes.io/projected/f295e822-750b-4a52-86c5-c07b68326988-kube-api-access-lw4wh\") pod \"klusterlet-addon-workmgr-5c4c6fd5c5-k6449\" (UID: \"f295e822-750b-4a52-86c5-c07b68326988\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.060702 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.060592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjfmq\" (UniqueName: \"kubernetes.io/projected/fc5fd5bf-9901-4896-aadd-b4deee5ccfe0-kube-api-access-fjfmq\") pod \"managed-serviceaccount-addon-agent-5cd8754f67-bm7ld\" (UID: \"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:44.060702 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.060617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkg22\" (UniqueName: \"kubernetes.io/projected/74a04e11-0a0e-437a-9ca8-d8e402ebb666-kube-api-access-xkg22\") pod \"cluster-proxy-proxy-agent-65d87bb68b-td26r\" (UID: \"74a04e11-0a0e-437a-9ca8-d8e402ebb666\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.087971 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.087946 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" Apr 20 14:55:44.114520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.114499 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:44.127316 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.127287 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:55:44.277520 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.274900 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld"] Apr 20 14:55:44.283191 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:44.283028 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc5fd5bf_9901_4896_aadd_b4deee5ccfe0.slice/crio-1f42961dde94d96cd77a64569c82871e33574072f08a94705093d2cfcdd267fc WatchSource:0}: Error finding container 1f42961dde94d96cd77a64569c82871e33574072f08a94705093d2cfcdd267fc: Status 404 returned error can't find the container with id 1f42961dde94d96cd77a64569c82871e33574072f08a94705093d2cfcdd267fc Apr 20 14:55:44.284420 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.284219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449"] Apr 20 14:55:44.312764 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.312672 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r"] Apr 20 14:55:44.315268 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:44.315246 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf295e822_750b_4a52_86c5_c07b68326988.slice/crio-c4f07ccbb51c529571808e0a11c793f591fb5c8374ec49b364a9cf1ba283e5c4 WatchSource:0}: Error finding container c4f07ccbb51c529571808e0a11c793f591fb5c8374ec49b364a9cf1ba283e5c4: Status 404 returned error can't find the container with id c4f07ccbb51c529571808e0a11c793f591fb5c8374ec49b364a9cf1ba283e5c4 Apr 20 14:55:44.315779 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:44.315758 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74a04e11_0a0e_437a_9ca8_d8e402ebb666.slice/crio-778d7888f05d74f350cb060dd47271ac2ad34cd5371bbe08c2fe16a81d1b4dce WatchSource:0}: Error finding container 778d7888f05d74f350cb060dd47271ac2ad34cd5371bbe08c2fe16a81d1b4dce: Status 404 returned error can't find the container with id 778d7888f05d74f350cb060dd47271ac2ad34cd5371bbe08c2fe16a81d1b4dce Apr 20 14:55:44.658208 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.658120 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:44.658208 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:44.658185 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:44.658437 ip-10-0-141-9 
kubenswrapper[2571]: I0420 14:55:44.658247 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658272 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658327 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658342 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658346 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658348 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:48.658324217 +0000 UTC m=+42.252143280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658394 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:48.658379951 +0000 UTC m=+42.252198995 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found Apr 20 14:55:44.658437 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:44.658412 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:48.658402847 +0000 UTC m=+42.252221894 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found Apr 20 14:55:45.063187 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:45.063129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" event={"ID":"f295e822-750b-4a52-86c5-c07b68326988","Type":"ContainerStarted","Data":"c4f07ccbb51c529571808e0a11c793f591fb5c8374ec49b364a9cf1ba283e5c4"} Apr 20 14:55:45.064746 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:45.064706 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" event={"ID":"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0","Type":"ContainerStarted","Data":"1f42961dde94d96cd77a64569c82871e33574072f08a94705093d2cfcdd267fc"} Apr 20 14:55:45.069303 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:45.069274 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e10015e-64f6-4b90-b27b-5d53c810c05d" containerID="b52194dfe1953c2d80e3e331a3632a80bba1eaa81ae8a896837c1f965bc341f6" exitCode=0 Apr 20 14:55:45.069427 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:45.069378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerDied","Data":"b52194dfe1953c2d80e3e331a3632a80bba1eaa81ae8a896837c1f965bc341f6"} Apr 20 14:55:45.075655 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:45.075625 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerStarted","Data":"778d7888f05d74f350cb060dd47271ac2ad34cd5371bbe08c2fe16a81d1b4dce"} Apr 20 14:55:46.086576 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:46.085969 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" event={"ID":"2e10015e-64f6-4b90-b27b-5d53c810c05d","Type":"ContainerStarted","Data":"032e0aa9786aecf40e15225d84184e3ed044bfed74252501ab83d1795fb296e4"} Apr 20 14:55:46.110790 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:46.110447 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kfbrh" podStartSLOduration=5.8743358610000005 podStartE2EDuration="39.11042934s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:55:09.684574962 +0000 UTC m=+3.278394017" lastFinishedPulling="2026-04-20 14:55:42.92066845 +0000 UTC m=+36.514487496" observedRunningTime="2026-04-20 14:55:46.108171654 +0000 UTC m=+39.701990722" watchObservedRunningTime="2026-04-20 14:55:46.11042934 +0000 UTC m=+39.704248404" Apr 20 14:55:48.694670 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:48.694632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:48.694715 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:48.694764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694796 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694818 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694841 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694873 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.694856108 +0000 UTC m=+50.288675148 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694906 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.69488779 +0000 UTC m=+50.288706863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694849 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:48.695036 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:48.694948 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.694937862 +0000 UTC m=+50.288756904 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found Apr 20 14:55:48.997096 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:48.997007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:49.000863 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:49.000829 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/78ea8d8a-7389-4822-92b4-41f9e8b474b9-original-pull-secret\") pod \"global-pull-secret-syncer-gd648\" (UID: \"78ea8d8a-7389-4822-92b4-41f9e8b474b9\") " pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:49.152219 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:49.152180 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gd648" Apr 20 14:55:50.577907 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:50.577882 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gd648"] Apr 20 14:55:50.581874 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:55:50.581850 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78ea8d8a_7389_4822_92b4_41f9e8b474b9.slice/crio-d4edb138399e74a387404ad4abb36813e5539625d14672b0faa9dcd4d0447ce4 WatchSource:0}: Error finding container d4edb138399e74a387404ad4abb36813e5539625d14672b0faa9dcd4d0447ce4: Status 404 returned error can't find the container with id d4edb138399e74a387404ad4abb36813e5539625d14672b0faa9dcd4d0447ce4 Apr 20 14:55:51.097943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.097910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gd648" event={"ID":"78ea8d8a-7389-4822-92b4-41f9e8b474b9","Type":"ContainerStarted","Data":"d4edb138399e74a387404ad4abb36813e5539625d14672b0faa9dcd4d0447ce4"} Apr 20 14:55:51.099163 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.099140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerStarted","Data":"0ac2398b432fade9b9a348a88901902eacb713180e921066f272c669ba17511c"} Apr 20 14:55:51.100362 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.100339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" event={"ID":"f295e822-750b-4a52-86c5-c07b68326988","Type":"ContainerStarted","Data":"a93086843effe8cb9ca160e0522b3708eeb8ff3bab27e369b5fa62cd70e64ac9"} Apr 20 14:55:51.100545 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.100528 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:51.101561 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.101541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" event={"ID":"fc5fd5bf-9901-4896-aadd-b4deee5ccfe0","Type":"ContainerStarted","Data":"241a7ff224e2ddd96cbf649bf8159a7a1e947db66fc08949122cdbe21fe18463"} Apr 20 14:55:51.102202 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.102187 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" Apr 20 14:55:51.130674 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.130635 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4c6fd5c5-k6449" podStartSLOduration=1.9764347820000001 podStartE2EDuration="8.130624563s" podCreationTimestamp="2026-04-20 14:55:43 +0000 UTC" firstStartedPulling="2026-04-20 14:55:44.318736699 +0000 UTC m=+37.912555743" lastFinishedPulling="2026-04-20 14:55:50.472926465 +0000 UTC m=+44.066745524" observedRunningTime="2026-04-20 14:55:51.115704074 +0000 UTC m=+44.709523136" watchObservedRunningTime="2026-04-20 14:55:51.130624563 +0000 UTC m=+44.724443626" Apr 20 14:55:51.145123 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:51.145079 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd8754f67-bm7ld" podStartSLOduration=1.975246178 podStartE2EDuration="8.145065462s" podCreationTimestamp="2026-04-20 14:55:43 +0000 UTC" firstStartedPulling="2026-04-20 14:55:44.289943841 +0000 UTC m=+37.883762900" lastFinishedPulling="2026-04-20 14:55:50.459763132 +0000 UTC m=+44.053582184" observedRunningTime="2026-04-20 14:55:51.144447005 +0000 UTC m=+44.738266069" watchObservedRunningTime="2026-04-20 14:55:51.145065462 +0000 UTC m=+44.738884525" Apr 20 14:55:54.110339 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:54.110295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerStarted","Data":"cededa19626d1768f044ea287fa6d94849da35a864a5105919377320267de168"} Apr 20 14:55:54.110339 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:54.110336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerStarted","Data":"d54b1ccd43df8a59c99eb9cb47db9c275801190c43c777212d5b705c907f9cdd"} Apr 20 14:55:54.130264 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:54.130209 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" podStartSLOduration=2.276750951 podStartE2EDuration="11.130190926s" podCreationTimestamp="2026-04-20 14:55:43 +0000 UTC" firstStartedPulling="2026-04-20 14:55:44.319149922 +0000 UTC m=+37.912968964" lastFinishedPulling="2026-04-20 14:55:53.17258989 +0000 UTC m=+46.766408939" observedRunningTime="2026-04-20 14:55:54.12834825 +0000 UTC m=+47.722167324" watchObservedRunningTime="2026-04-20 14:55:54.130190926 +0000 UTC m=+47.724010005" Apr 20 14:55:55.118756 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:55.118717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gd648" 
event={"ID":"78ea8d8a-7389-4822-92b4-41f9e8b474b9","Type":"ContainerStarted","Data":"6df20c0f134cf4b8a880f6e109711967763e3fb405d74b8f137e5bbd1b938b92"} Apr 20 14:55:55.132738 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:55.132650 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gd648" podStartSLOduration=33.77516729 podStartE2EDuration="38.132638866s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:50.584067339 +0000 UTC m=+44.177886380" lastFinishedPulling="2026-04-20 14:55:54.941538915 +0000 UTC m=+48.535357956" observedRunningTime="2026-04-20 14:55:55.132050872 +0000 UTC m=+48.725869934" watchObservedRunningTime="2026-04-20 14:55:55.132638866 +0000 UTC m=+48.726457928" Apr 20 14:55:56.759889 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:56.759851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:56.759915 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5" Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:55:56.759952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760006 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760023 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-665f5fd44d-xtws4: secret "image-registry-tls" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760036 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760066 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760091 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert podName:e0053f0d-ea66-4e0b-950d-23c42c995f23 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.760072774 +0000 UTC m=+66.353891815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert") pod "ingress-canary-ktxxq" (UID: "e0053f0d-ea66-4e0b-950d-23c42c995f23") : secret "canary-serving-cert" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760107 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls podName:b1326961-f2f6-425b-9b48-4892eb2c7d31 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.760099112 +0000 UTC m=+66.353918153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls") pod "image-registry-665f5fd44d-xtws4" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31") : secret "image-registry-tls" not found Apr 20 14:55:56.760324 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:55:56.760129 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls podName:f51becfe-6707-48a0-8930-b1feea33fb21 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.760114913 +0000 UTC m=+66.353933955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls") pod "dns-default-8vlv5" (UID: "f51becfe-6707-48a0-8930-b1feea33fb21") : secret "dns-default-metrics-tls" not found Apr 20 14:56:06.054157 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:06.054127 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qpw9h" Apr 20 14:56:07.296116 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:07.296089 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58fm4_b535020a-3ebe-44bb-8180-63bb281aceff/dns-node-resolver/0.log" Apr 20 14:56:08.295496 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:08.295468 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gk5rl_7be4d4a0-b5c2-4857-a3ae-245ad4430c7c/node-ca/0.log" Apr 20 14:56:12.680132 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.680089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:56:12.682290 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.682271 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:56:12.691142 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:56:12.691107 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:56:12.691241 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:56:12.691209 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs podName:d1dafe36-2ae8-4593-82df-fbff4eee87b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:57:16.691173339 +0000 UTC m=+130.284992383 (durationBeforeRetry 1m4s). 
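
Four pods are blocked at this point on secrets that apparently have not been created yet (canary-serving-cert, image-registry-tls, dns-default-metrics-tls, metrics-daemon-secret). One way to triage this class of failure is to pull every missing-secret mount error out of the journal and list which pod is waiting on which secret. A minimal sketch, assuming the journal has been saved one entry per line to journal.log (an assumed filename, not something named in the log):

```python
import re

# Each failed SetUp above ends with: pod "<name>" ... : secret "<name>" not found.
pattern = re.compile(
    r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)"'
    r'.*?pod "(?P<pod>[^"]+)"'
    r'.*?secret "(?P<secret>[^"]+)" not found')

with open("journal.log") as journal:
    for line in journal:
        m = pattern.search(line)
        if m:
            print(f'{m["pod"]}: volume "{m["volume"]}" blocked on secret "{m["secret"]}"')
```

Against the entries above this reports ingress-canary-ktxxq, image-registry-665f5fd44d-xtws4, dns-default-8vlv5 and network-metrics-daemon-7v79q, each waiting on a secret its operator had apparently not yet published.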
Apr 20 14:56:12.781037 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.781003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:56:12.781231 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.781066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:56:12.781231 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.781089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:56:12.781231 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.781111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:56:12.783550 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.783514 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 14:56:12.783663 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.783567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f51becfe-6707-48a0-8930-b1feea33fb21-metrics-tls\") pod \"dns-default-8vlv5\" (UID: \"f51becfe-6707-48a0-8930-b1feea33fb21\") " pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:56:12.783726 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.783707 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"image-registry-665f5fd44d-xtws4\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:56:12.783762 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.783739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0053f0d-ea66-4e0b-950d-23c42c995f23-cert\") pod \"ingress-canary-ktxxq\" (UID: \"e0053f0d-ea66-4e0b-950d-23c42c995f23\") " pod="openshift-ingress-canary/ingress-canary-ktxxq"
Apr 20 14:56:12.793546 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.793524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 14:56:12.804544 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.804521 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsxkc\" (UniqueName: \"kubernetes.io/projected/cb271ee0-fe50-4ec5-a58b-e4cde09671b7-kube-api-access-zsxkc\") pod \"network-check-target-x9fss\" (UID: \"cb271ee0-fe50-4ec5-a58b-e4cde09671b7\") " pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:56:12.841868 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.841841 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\""
Apr 20 14:56:12.849948 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.849921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x9fss"
Apr 20 14:56:12.964009 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.963944 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fgmdb\""
Apr 20 14:56:12.969385 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.969359 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x9fss"]
Apr 20 14:56:12.971916 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.971896 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4"
Apr 20 14:56:12.972097 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:56:12.972065 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb271ee0_fe50_4ec5_a58b_e4cde09671b7.slice/crio-181c7ac117c865730d9b7cd98e96c289b02aaf2920b82a942f925f5be3459007 WatchSource:0}: Error finding container 181c7ac117c865730d9b7cd98e96c289b02aaf2920b82a942f925f5be3459007: Status 404 returned error can't find the container with id 181c7ac117c865730d9b7cd98e96c289b02aaf2920b82a942f925f5be3459007
Apr 20 14:56:12.978964 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.978943 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tr5vz\""
Apr 20 14:56:12.987176 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:12.987154 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8vlv5"
Apr 20 14:56:13.003322 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.003298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xg95b\""
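
The retry timing above is worth noting: the 14:55:56.760 failures were told "No retries permitted until 2026-04-20 14:56:12.760" (exactly 16s later), and the reconciler's 14:56:12.781 retry succeeded because the secrets had been created in the interim. The delays visible in this log (500ms for node-exporter-tls further below, 16s here, 1m4s for metrics-certs) are consistent with a per-volume exponential backoff that starts at 500ms and doubles on each failure. A sketch of that schedule; the starting value and factor are inferred from the observed delays, and the cap is an assumption, not something stated in the log:

```python
# Reproduces the delays seen above (0.5s, 16s, 64s) assuming a doubling
# backoff starting at 500ms; the 122s cap is a guess.
def backoff_delays(initial=0.5, factor=2.0, cap=122.0, attempts=9):
    delay, out = initial, []
    for _ in range(attempts):
        out.append(min(delay, cap))
        delay *= factor
    return out

delays = backoff_delays()
print(delays)                             # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 122.0]
assert 16.0 in delays and 64.0 in delays  # 16s and 1m4s both sit on the schedule
```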
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ktxxq" Apr 20 14:56:13.116741 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.116702 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"] Apr 20 14:56:13.119160 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:56:13.119130 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1326961_f2f6_425b_9b48_4892eb2c7d31.slice/crio-63d433551c8ebc7f66e83652b90e6c1a44e9f9c747d2e67938279e086b385fac WatchSource:0}: Error finding container 63d433551c8ebc7f66e83652b90e6c1a44e9f9c747d2e67938279e086b385fac: Status 404 returned error can't find the container with id 63d433551c8ebc7f66e83652b90e6c1a44e9f9c747d2e67938279e086b385fac Apr 20 14:56:13.133790 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.133729 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8vlv5"] Apr 20 14:56:13.136006 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:56:13.135979 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf51becfe_6707_48a0_8930_b1feea33fb21.slice/crio-48ead208e70fc97927dddc35fd079aa4491ede7008e1c0f4fc6515a67b656a68 WatchSource:0}: Error finding container 48ead208e70fc97927dddc35fd079aa4491ede7008e1c0f4fc6515a67b656a68: Status 404 returned error can't find the container with id 48ead208e70fc97927dddc35fd079aa4491ede7008e1c0f4fc6515a67b656a68 Apr 20 14:56:13.149364 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.149337 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ktxxq"] Apr 20 14:56:13.151755 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:56:13.151727 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0053f0d_ea66_4e0b_950d_23c42c995f23.slice/crio-81826fafc183ea3fc0837b36b6d55558615c926dec0407da7b2f8bc5ffc833db WatchSource:0}: Error finding container 81826fafc183ea3fc0837b36b6d55558615c926dec0407da7b2f8bc5ffc833db: Status 404 returned error can't find the container with id 81826fafc183ea3fc0837b36b6d55558615c926dec0407da7b2f8bc5ffc833db Apr 20 14:56:13.168198 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.168162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8vlv5" event={"ID":"f51becfe-6707-48a0-8930-b1feea33fb21","Type":"ContainerStarted","Data":"48ead208e70fc97927dddc35fd079aa4491ede7008e1c0f4fc6515a67b656a68"} Apr 20 14:56:13.169340 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.169234 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ktxxq" event={"ID":"e0053f0d-ea66-4e0b-950d-23c42c995f23","Type":"ContainerStarted","Data":"81826fafc183ea3fc0837b36b6d55558615c926dec0407da7b2f8bc5ffc833db"} Apr 20 14:56:13.170380 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.170354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" event={"ID":"b1326961-f2f6-425b-9b48-4892eb2c7d31","Type":"ContainerStarted","Data":"63d433551c8ebc7f66e83652b90e6c1a44e9f9c747d2e67938279e086b385fac"} Apr 20 14:56:13.171309 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:13.171283 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9fss" 
event={"ID":"cb271ee0-fe50-4ec5-a58b-e4cde09671b7","Type":"ContainerStarted","Data":"181c7ac117c865730d9b7cd98e96c289b02aaf2920b82a942f925f5be3459007"} Apr 20 14:56:14.176513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:14.176478 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" event={"ID":"b1326961-f2f6-425b-9b48-4892eb2c7d31","Type":"ContainerStarted","Data":"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9"} Apr 20 14:56:14.177159 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:14.176657 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:56:14.198482 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:14.198429 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" podStartSLOduration=67.198415389 podStartE2EDuration="1m7.198415389s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:14.197189028 +0000 UTC m=+67.791008108" watchObservedRunningTime="2026-04-20 14:56:14.198415389 +0000 UTC m=+67.792234455" Apr 20 14:56:17.186635 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.186602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8vlv5" event={"ID":"f51becfe-6707-48a0-8930-b1feea33fb21","Type":"ContainerStarted","Data":"aee8fb75e03c9105aeb6ae67502bc5206ada01181996d0fd14cfc982efa63855"} Apr 20 14:56:17.186635 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.186636 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8vlv5" event={"ID":"f51becfe-6707-48a0-8930-b1feea33fb21","Type":"ContainerStarted","Data":"6cc4413e2cd28ece17f74ad37002e7ea174f32510aa15afdf9d9dda28f3ea8f4"} Apr 20 14:56:17.187169 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.186717 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8vlv5" Apr 20 14:56:17.187974 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.187952 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ktxxq" event={"ID":"e0053f0d-ea66-4e0b-950d-23c42c995f23","Type":"ContainerStarted","Data":"765c80d795e5316e63e323693fc9d437ed82bf4a76b148f0136d6650546fc0c7"} Apr 20 14:56:17.189185 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.189158 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x9fss" event={"ID":"cb271ee0-fe50-4ec5-a58b-e4cde09671b7","Type":"ContainerStarted","Data":"e0e783ed4b8432508ddc87052396d76d2f6a74f25d0df81670a2ebae79ba35ca"} Apr 20 14:56:17.189299 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.189285 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:56:17.203011 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.202967 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8vlv5" podStartSLOduration=33.924286341 podStartE2EDuration="37.202954699s" podCreationTimestamp="2026-04-20 14:55:40 +0000 UTC" firstStartedPulling="2026-04-20 14:56:13.137859442 +0000 UTC m=+66.731678483" lastFinishedPulling="2026-04-20 14:56:16.416527798 +0000 UTC m=+70.010346841" 
observedRunningTime="2026-04-20 14:56:17.201994518 +0000 UTC m=+70.795813580" watchObservedRunningTime="2026-04-20 14:56:17.202954699 +0000 UTC m=+70.796773761" Apr 20 14:56:17.217100 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.217051 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x9fss" podStartSLOduration=66.763299389 podStartE2EDuration="1m10.217036194s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:56:12.975545975 +0000 UTC m=+66.569365025" lastFinishedPulling="2026-04-20 14:56:16.429282779 +0000 UTC m=+70.023101830" observedRunningTime="2026-04-20 14:56:17.215164355 +0000 UTC m=+70.808983431" watchObservedRunningTime="2026-04-20 14:56:17.217036194 +0000 UTC m=+70.810855258" Apr 20 14:56:17.232252 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:17.232213 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ktxxq" podStartSLOduration=33.968078485 podStartE2EDuration="37.232203021s" podCreationTimestamp="2026-04-20 14:55:40 +0000 UTC" firstStartedPulling="2026-04-20 14:56:13.153877128 +0000 UTC m=+66.747696173" lastFinishedPulling="2026-04-20 14:56:16.418001654 +0000 UTC m=+70.011820709" observedRunningTime="2026-04-20 14:56:17.231442089 +0000 UTC m=+70.825261174" watchObservedRunningTime="2026-04-20 14:56:17.232203021 +0000 UTC m=+70.826022083" Apr 20 14:56:22.421850 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.421820 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fsfq2"] Apr 20 14:56:22.468658 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.468624 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fsfq2"] Apr 20 14:56:22.468815 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.468768 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.471260 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.471235 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:56:22.471583 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.471570 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:56:22.471775 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.471748 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:56:22.471831 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.471748 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n4j2j\"" Apr 20 14:56:22.474892 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.474872 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:56:22.562024 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.561999 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b19cde52-4a17-4909-9fca-1ee609bd3a49-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.562181 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.562037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b19cde52-4a17-4909-9fca-1ee609bd3a49-data-volume\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.562181 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.562061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b19cde52-4a17-4909-9fca-1ee609bd3a49-crio-socket\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.562181 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.562139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.562279 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.562194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrlj\" (UniqueName: \"kubernetes.io/projected/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-api-access-kgrlj\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663480 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b19cde52-4a17-4909-9fca-1ee609bd3a49-data-volume\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663513 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b19cde52-4a17-4909-9fca-1ee609bd3a49-crio-socket\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrlj\" (UniqueName: \"kubernetes.io/projected/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-api-access-kgrlj\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663705 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b19cde52-4a17-4909-9fca-1ee609bd3a49-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b19cde52-4a17-4909-9fca-1ee609bd3a49-crio-socket\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.663935 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.663916 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b19cde52-4a17-4909-9fca-1ee609bd3a49-data-volume\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.664122 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.664094 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.665892 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.665876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b19cde52-4a17-4909-9fca-1ee609bd3a49-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.671833 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.671816 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrlj\" (UniqueName: \"kubernetes.io/projected/b19cde52-4a17-4909-9fca-1ee609bd3a49-kube-api-access-kgrlj\") pod \"insights-runtime-extractor-fsfq2\" (UID: \"b19cde52-4a17-4909-9fca-1ee609bd3a49\") " pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.778046 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.777952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fsfq2" Apr 20 14:56:22.890447 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:22.890414 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fsfq2"] Apr 20 14:56:23.205790 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:23.205756 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fsfq2" event={"ID":"b19cde52-4a17-4909-9fca-1ee609bd3a49","Type":"ContainerStarted","Data":"9ed349206b42811268a3f568b4f644733d8d5f179f3e3f38f96da9320e46d169"} Apr 20 14:56:23.205790 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:23.205792 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fsfq2" event={"ID":"b19cde52-4a17-4909-9fca-1ee609bd3a49","Type":"ContainerStarted","Data":"a43f0ab50b1d94b3924afc60ccefef67e7c10b90064a17913e8f2fd6c5cad4a6"} Apr 20 14:56:24.212620 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:24.212574 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fsfq2" event={"ID":"b19cde52-4a17-4909-9fca-1ee609bd3a49","Type":"ContainerStarted","Data":"0979198d49e882074d4bb00d55e4c1164354917ca7c4d170e1e3bf03a8552c8a"} Apr 20 14:56:26.219460 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:26.219422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fsfq2" event={"ID":"b19cde52-4a17-4909-9fca-1ee609bd3a49","Type":"ContainerStarted","Data":"29074bd0062bdb8c60269735b1858346197e4b69aadf49bcee5eea3cac4a5a76"} Apr 20 14:56:26.235297 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:26.235244 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fsfq2" podStartSLOduration=1.249980006 podStartE2EDuration="4.235231s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.036562883 +0000 UTC m=+76.630381937" lastFinishedPulling="2026-04-20 14:56:26.021813878 +0000 UTC m=+79.615632931" observedRunningTime="2026-04-20 14:56:26.235106015 +0000 UTC m=+79.828925103" watchObservedRunningTime="2026-04-20 14:56:26.235231 +0000 UTC m=+79.829050063" Apr 20 14:56:27.193580 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:27.193550 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8vlv5" Apr 20 14:56:35.183679 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:35.183649 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:56:44.776754 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:44.776715 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"] Apr 20 14:56:45.924814 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.924780 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qzs5m"] Apr 20 14:56:45.929477 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.929456 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:45.931956 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.931930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 14:56:45.932127 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.931933 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:56:45.932127 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.932059 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 14:56:45.932127 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.932071 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:56:45.932127 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.932083 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cx2mf\"" Apr 20 14:56:45.932744 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.932730 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:56:45.932813 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:45.932770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 14:56:46.039109 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-sys\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039109 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-wtmp\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039357 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039357 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-textfile\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039357 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039183 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk96v\" (UniqueName: \"kubernetes.io/projected/1d6abef2-4fbf-4446-985e-849ae60e2a9b-kube-api-access-pk96v\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039357 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039357 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-root\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039517 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039388 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-metrics-client-ca\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.039517 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.039440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140478 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-sys\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140478 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-wtmp\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140498 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " 
pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-textfile\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pk96v\" (UniqueName: \"kubernetes.io/projected/1d6abef2-4fbf-4446-985e-849ae60e2a9b-kube-api-access-pk96v\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-sys\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-root\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140622 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-wtmp\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140626 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-metrics-client-ca\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140700 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.140752 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:56:46.140730 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 14:56:46.141152 ip-10-0-141-9 kubenswrapper[2571]: E0420 
14:56:46.140819 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls podName:1d6abef2-4fbf-4446-985e-849ae60e2a9b nodeName:}" failed. No retries permitted until 2026-04-20 14:56:46.640797435 +0000 UTC m=+100.234616490 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls") pod "node-exporter-qzs5m" (UID: "1d6abef2-4fbf-4446-985e-849ae60e2a9b") : secret "node-exporter-tls" not found Apr 20 14:56:46.141152 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.140861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1d6abef2-4fbf-4446-985e-849ae60e2a9b-root\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.141152 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.141003 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-textfile\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.141254 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.141185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-metrics-client-ca\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.141353 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.141327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-accelerators-collector-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.142994 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.142975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.149533 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.149508 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk96v\" (UniqueName: \"kubernetes.io/projected/1d6abef2-4fbf-4446-985e-849ae60e2a9b-kube-api-access-pk96v\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.645727 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.645667 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.647943 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.647919 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1d6abef2-4fbf-4446-985e-849ae60e2a9b-node-exporter-tls\") pod \"node-exporter-qzs5m\" (UID: \"1d6abef2-4fbf-4446-985e-849ae60e2a9b\") " pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.838022 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:46.837993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qzs5m" Apr 20 14:56:46.848088 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:56:46.848062 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d6abef2_4fbf_4446_985e_849ae60e2a9b.slice/crio-848feef08d77894d1826a3a706c2bd6371ba0034dba9ee978a46de5244845f3a WatchSource:0}: Error finding container 848feef08d77894d1826a3a706c2bd6371ba0034dba9ee978a46de5244845f3a: Status 404 returned error can't find the container with id 848feef08d77894d1826a3a706c2bd6371ba0034dba9ee978a46de5244845f3a Apr 20 14:56:47.271774 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:47.271738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzs5m" event={"ID":"1d6abef2-4fbf-4446-985e-849ae60e2a9b","Type":"ContainerStarted","Data":"848feef08d77894d1826a3a706c2bd6371ba0034dba9ee978a46de5244845f3a"} Apr 20 14:56:48.194589 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:48.194562 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x9fss" Apr 20 14:56:48.278221 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:48.278188 2571 generic.go:358] "Generic (PLEG): container finished" podID="1d6abef2-4fbf-4446-985e-849ae60e2a9b" containerID="65cc2edfa3a654ca171e222421ae67775f883a135fd8040abb751a3895b728a9" exitCode=0 Apr 20 14:56:48.278591 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:48.278241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzs5m" event={"ID":"1d6abef2-4fbf-4446-985e-849ae60e2a9b","Type":"ContainerDied","Data":"65cc2edfa3a654ca171e222421ae67775f883a135fd8040abb751a3895b728a9"} Apr 20 14:56:49.282901 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:49.282859 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzs5m" event={"ID":"1d6abef2-4fbf-4446-985e-849ae60e2a9b","Type":"ContainerStarted","Data":"2fea53e04b4b3d39585b10a535197d9b26447da86f7a8d167cd7359f46e218b6"} Apr 20 14:56:49.283411 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:49.283377 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qzs5m" event={"ID":"1d6abef2-4fbf-4446-985e-849ae60e2a9b","Type":"ContainerStarted","Data":"88c713497e82b5ae17e1ebaf392c4d9a0b789f5eaffe4172ad4e7f4412a43c6a"} Apr 20 14:56:49.302743 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:56:49.302678 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qzs5m" podStartSLOduration=3.539786744 podStartE2EDuration="4.30266043s" podCreationTimestamp="2026-04-20 14:56:45 +0000 UTC" firstStartedPulling="2026-04-20 14:56:46.850287641 +0000 UTC m=+100.444106681" lastFinishedPulling="2026-04-20 14:56:47.613161326 +0000 UTC m=+101.206980367" observedRunningTime="2026-04-20 14:56:49.302461907 +0000 UTC m=+102.896280971" watchObservedRunningTime="2026-04-20 14:56:49.30266043 +0000 UTC m=+102.896479494" Apr 20 14:57:04.128971 
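
The "Failed to process watch event ... Status 404" warnings that precede each first ContainerStarted event (848feef0... here) look like a benign race: the cgroup watcher notices the new crio-<id> cgroup before the runtime has registered the container, and the same ID then shows up as started moments later. A sketch that pairs each 404 warning with a later ContainerStarted for the same ID, so only unpaired warnings need attention (same assumed journal.log, one entry per line):

```python
import re

warned, started = set(), set()
with open("journal.log") as journal:
    for line in journal:
        m = re.search(r"can't find the container with id ([0-9a-f]{64})", line)
        if m:
            warned.add(m.group(1))
        m = re.search(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"', line)
        if m:
            started.add(m.group(1))

print("unpaired 404 warnings:", sorted(warned - started) or "none")
```

For the stretch of log above, every warned ID is later reported started, so the output is "none".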
ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:04.128910 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" podUID="74a04e11-0a0e-437a-9ca8-d8e402ebb666" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:57:09.800436 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:09.800368 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" podUID="b1326961-f2f6-425b-9b48-4892eb2c7d31" containerName="registry" containerID="cri-o://a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9" gracePeriod=30 Apr 20 14:57:10.041566 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.041540 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:57:10.116706 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116614 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116706 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116646 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6vzw\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116706 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116678 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116729 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116757 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116792 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116824 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted\") 
pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.116930 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.116879 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca\") pod \"b1326961-f2f6-425b-9b48-4892eb2c7d31\" (UID: \"b1326961-f2f6-425b-9b48-4892eb2c7d31\") " Apr 20 14:57:10.117368 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.117298 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:10.117487 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.117395 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:10.119415 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.119389 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:10.119558 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.119524 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:10.119662 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.119571 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw" (OuterVolumeSpecName: "kube-api-access-z6vzw") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "kube-api-access-z6vzw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:10.119662 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.119616 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:10.119662 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.119641 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:10.125792 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.125765 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1326961-f2f6-425b-9b48-4892eb2c7d31" (UID: "b1326961-f2f6-425b-9b48-4892eb2c7d31"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:57:10.217588 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217559 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-trusted-ca\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217590 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-image-registry-private-configuration\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217607 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6vzw\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-kube-api-access-z6vzw\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217618 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-tls\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217628 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1326961-f2f6-425b-9b48-4892eb2c7d31-registry-certificates\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217636 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1326961-f2f6-425b-9b48-4892eb2c7d31-installation-pull-secrets\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217645 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1326961-f2f6-425b-9b48-4892eb2c7d31-bound-sa-token\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.217735 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.217655 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1326961-f2f6-425b-9b48-4892eb2c7d31-ca-trust-extracted\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 14:57:10.335789 
ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.335752 2571 generic.go:358] "Generic (PLEG): container finished" podID="b1326961-f2f6-425b-9b48-4892eb2c7d31" containerID="a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9" exitCode=0 Apr 20 14:57:10.335948 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.335822 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" Apr 20 14:57:10.335948 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.335834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" event={"ID":"b1326961-f2f6-425b-9b48-4892eb2c7d31","Type":"ContainerDied","Data":"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9"} Apr 20 14:57:10.335948 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.335871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665f5fd44d-xtws4" event={"ID":"b1326961-f2f6-425b-9b48-4892eb2c7d31","Type":"ContainerDied","Data":"63d433551c8ebc7f66e83652b90e6c1a44e9f9c747d2e67938279e086b385fac"} Apr 20 14:57:10.335948 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.335897 2571 scope.go:117] "RemoveContainer" containerID="a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9" Apr 20 14:57:10.343607 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.343590 2571 scope.go:117] "RemoveContainer" containerID="a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9" Apr 20 14:57:10.343889 ip-10-0-141-9 kubenswrapper[2571]: E0420 14:57:10.343869 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9\": container with ID starting with a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9 not found: ID does not exist" containerID="a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9" Apr 20 14:57:10.343954 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.343898 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9"} err="failed to get container status \"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9\": rpc error: code = NotFound desc = could not find container \"a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9\": container with ID starting with a8ecc59994f8bc2424a8090fceadcfbe6d1f6e7e3a04064ac7fb460d51c392e9 not found: ID does not exist" Apr 20 14:57:10.355499 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.355476 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"] Apr 20 14:57:10.360302 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.360281 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-665f5fd44d-xtws4"] Apr 20 14:57:10.931104 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:10.931070 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1326961-f2f6-425b-9b48-4892eb2c7d31" path="/var/lib/kubelet/pods/b1326961-f2f6-425b-9b48-4892eb2c7d31/volumes" Apr 20 14:57:14.128328 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:14.128291 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" 
podUID="74a04e11-0a0e-437a-9ca8-d8e402ebb666" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:57:16.765659 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:16.765617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:57:16.768416 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:16.768394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d1dafe36-2ae8-4593-82df-fbff4eee87b1-metrics-certs\") pod \"network-metrics-daemon-7v79q\" (UID: \"d1dafe36-2ae8-4593-82df-fbff4eee87b1\") " pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:57:17.048409 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:17.048385 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:57:17.056152 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:17.056130 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7v79q" Apr 20 14:57:17.167219 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:17.167190 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7v79q"] Apr 20 14:57:17.170284 ip-10-0-141-9 kubenswrapper[2571]: W0420 14:57:17.170261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1dafe36_2ae8_4593_82df_fbff4eee87b1.slice/crio-cae490065a50ff3d77fc50e489ab77fccbad3b39d79a6d8c1eb1ac7cd9c3f44c WatchSource:0}: Error finding container cae490065a50ff3d77fc50e489ab77fccbad3b39d79a6d8c1eb1ac7cd9c3f44c: Status 404 returned error can't find the container with id cae490065a50ff3d77fc50e489ab77fccbad3b39d79a6d8c1eb1ac7cd9c3f44c Apr 20 14:57:17.358800 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:17.358715 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v79q" event={"ID":"d1dafe36-2ae8-4593-82df-fbff4eee87b1","Type":"ContainerStarted","Data":"cae490065a50ff3d77fc50e489ab77fccbad3b39d79a6d8c1eb1ac7cd9c3f44c"} Apr 20 14:57:18.363203 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:18.363114 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v79q" event={"ID":"d1dafe36-2ae8-4593-82df-fbff4eee87b1","Type":"ContainerStarted","Data":"3fd85724c4acf793f8b76144a925f612314d01558477fa85f5c757d9972a59ac"} Apr 20 14:57:18.363203 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:18.363157 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7v79q" event={"ID":"d1dafe36-2ae8-4593-82df-fbff4eee87b1","Type":"ContainerStarted","Data":"bb7ddfd247e2742cf118f766e8d3c52b3889ccb43e611cb19299b13886a9bbec"} Apr 20 14:57:18.379509 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:18.379453 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7v79q" podStartSLOduration=130.460211512 podStartE2EDuration="2m11.379437041s" podCreationTimestamp="2026-04-20 14:55:07 +0000 UTC" firstStartedPulling="2026-04-20 14:57:17.172015749 +0000 UTC m=+130.765834789" 
lastFinishedPulling="2026-04-20 14:57:18.091241277 +0000 UTC m=+131.685060318" observedRunningTime="2026-04-20 14:57:18.378390061 +0000 UTC m=+131.972209127" watchObservedRunningTime="2026-04-20 14:57:18.379437041 +0000 UTC m=+131.973256104" Apr 20 14:57:24.128595 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.128557 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" podUID="74a04e11-0a0e-437a-9ca8-d8e402ebb666" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 14:57:24.129051 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.128626 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" Apr 20 14:57:24.129126 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.129108 2571 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cededa19626d1768f044ea287fa6d94849da35a864a5105919377320267de168"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 14:57:24.129162 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.129146 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" podUID="74a04e11-0a0e-437a-9ca8-d8e402ebb666" containerName="service-proxy" containerID="cri-o://cededa19626d1768f044ea287fa6d94849da35a864a5105919377320267de168" gracePeriod=30 Apr 20 14:57:24.379484 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.379387 2571 generic.go:358] "Generic (PLEG): container finished" podID="74a04e11-0a0e-437a-9ca8-d8e402ebb666" containerID="cededa19626d1768f044ea287fa6d94849da35a864a5105919377320267de168" exitCode=2 Apr 20 14:57:24.379484 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.379453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerDied","Data":"cededa19626d1768f044ea287fa6d94849da35a864a5105919377320267de168"} Apr 20 14:57:24.379736 ip-10-0-141-9 kubenswrapper[2571]: I0420 14:57:24.379501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-65d87bb68b-td26r" event={"ID":"74a04e11-0a0e-437a-9ca8-d8e402ebb666","Type":"ContainerStarted","Data":"39caea552298273f66f5b8f42cd88d5cc73cde72d4ac828a04a77a2bb250195a"} Apr 20 15:00:06.876671 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:00:06.876639 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 15:01:00.015622 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.015539 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"] Apr 20 15:01:00.016058 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.015787 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1326961-f2f6-425b-9b48-4892eb2c7d31" containerName="registry" Apr 20 15:01:00.016058 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.015800 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1326961-f2f6-425b-9b48-4892eb2c7d31" containerName="registry" Apr 20 15:01:00.016058 ip-10-0-141-9 kubenswrapper[2571]: 
Apr 20 15:01:00.018176 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.018159 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.020544 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.020523 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 15:01:00.020674 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.020592 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 15:01:00.021415 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.021397 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-k828n\""
Apr 20 15:01:00.026391 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.026364 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"]
Apr 20 15:01:00.070098 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.070052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqwxf\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-kube-api-access-wqwxf\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.070098 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.070112 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.171320 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.171268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.171496 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.171339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqwxf\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-kube-api-access-wqwxf\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.179207 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.179178 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.179328 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.179231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqwxf\" (UniqueName: \"kubernetes.io/projected/d083d60d-084b-46ad-b1d2-af5203a9eb76-kube-api-access-wqwxf\") pod \"cert-manager-cainjector-8966b78d4-qlt9f\" (UID: \"d083d60d-084b-46ad-b1d2-af5203a9eb76\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.328231 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.328202 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"
Apr 20 15:01:00.441125 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.440963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-qlt9f"]
Apr 20 15:01:00.443573 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:00.443545 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd083d60d_084b_46ad_b1d2_af5203a9eb76.slice/crio-b36b5872c753d6642c3f5ac59ca1dffe20ca68450f297dc974a9419e84db88ab WatchSource:0}: Error finding container b36b5872c753d6642c3f5ac59ca1dffe20ca68450f297dc974a9419e84db88ab: Status 404 returned error can't find the container with id b36b5872c753d6642c3f5ac59ca1dffe20ca68450f297dc974a9419e84db88ab
Apr 20 15:01:00.445347 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.445326 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:01:00.916013 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:00.915966 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f" event={"ID":"d083d60d-084b-46ad-b1d2-af5203a9eb76","Type":"ContainerStarted","Data":"b36b5872c753d6642c3f5ac59ca1dffe20ca68450f297dc974a9419e84db88ab"}
Apr 20 15:01:03.927397 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:03.927365 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f" event={"ID":"d083d60d-084b-46ad-b1d2-af5203a9eb76","Type":"ContainerStarted","Data":"d3e53b206aec180cbcb240698b35b1043837d3027e8c9373c15d55553da7b396"}
Apr 20 15:01:03.943248 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:03.943204 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-qlt9f" podStartSLOduration=1.656190936 podStartE2EDuration="4.943189975s" podCreationTimestamp="2026-04-20 15:00:59 +0000 UTC" firstStartedPulling="2026-04-20 15:01:00.445459218 +0000 UTC m=+354.039278260" lastFinishedPulling="2026-04-20 15:01:03.732458258 +0000 UTC m=+357.326277299" observedRunningTime="2026-04-20 15:01:03.941547781 +0000 UTC m=+357.535366845" watchObservedRunningTime="2026-04-20 15:01:03.943189975 +0000 UTC m=+357.537009038"
Apr 20 15:01:15.320485 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.320447 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-pf54d"]
Apr 20 15:01:15.323719 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.323702 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-pf54d"
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.325907 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.325885 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-r2nps\"" Apr 20 15:01:15.332303 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.332281 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-pf54d"] Apr 20 15:01:15.380124 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.380091 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mhx\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-kube-api-access-p7mhx\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.380292 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.380141 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-bound-sa-token\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.480904 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.480868 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-bound-sa-token\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.480990 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.480932 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mhx\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-kube-api-access-p7mhx\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.488614 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.488582 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-bound-sa-token\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.488810 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.488793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mhx\" (UniqueName: \"kubernetes.io/projected/fd2345a5-2203-4e2e-916b-2713bc66d418-kube-api-access-p7mhx\") pod \"cert-manager-759f64656b-pf54d\" (UID: \"fd2345a5-2203-4e2e-916b-2713bc66d418\") " pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.632652 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.632575 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-pf54d" Apr 20 15:01:15.747268 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.747232 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-pf54d"] Apr 20 15:01:15.750676 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:15.750649 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd2345a5_2203_4e2e_916b_2713bc66d418.slice/crio-63e2831835355c0c56f24011d96934805a1736babcd8ec5a8560d9786e2dd9ea WatchSource:0}: Error finding container 63e2831835355c0c56f24011d96934805a1736babcd8ec5a8560d9786e2dd9ea: Status 404 returned error can't find the container with id 63e2831835355c0c56f24011d96934805a1736babcd8ec5a8560d9786e2dd9ea Apr 20 15:01:15.962182 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.962091 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-pf54d" event={"ID":"fd2345a5-2203-4e2e-916b-2713bc66d418","Type":"ContainerStarted","Data":"1f20f176b7167354972641a9df23b4428c473b058d3d22bdfe9756996c5f9ba8"} Apr 20 15:01:15.962182 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.962129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-pf54d" event={"ID":"fd2345a5-2203-4e2e-916b-2713bc66d418","Type":"ContainerStarted","Data":"63e2831835355c0c56f24011d96934805a1736babcd8ec5a8560d9786e2dd9ea"} Apr 20 15:01:15.978696 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:15.978633 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-pf54d" podStartSLOduration=0.978619547 podStartE2EDuration="978.619547ms" podCreationTimestamp="2026-04-20 15:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:01:15.976930923 +0000 UTC m=+369.570749996" watchObservedRunningTime="2026-04-20 15:01:15.978619547 +0000 UTC m=+369.572438609" Apr 20 15:01:24.625056 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.625019 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx"] Apr 20 15:01:24.628115 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.628099 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.630480 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.630456 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 15:01:24.630613 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.630528 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 15:01:24.630786 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.630765 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 15:01:24.630878 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.630804 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vsz6j\"" Apr 20 15:01:24.631249 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.631229 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 15:01:24.641855 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.641821 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx"] Apr 20 15:01:24.745513 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.745482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.745703 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.745549 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws5st\" (UniqueName: \"kubernetes.io/projected/791a6494-d0fc-4ab3-9d90-1d60f971189e-kube-api-access-ws5st\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.745703 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.745578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.846306 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.846277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.846492 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.846359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws5st\" (UniqueName: 
\"kubernetes.io/projected/791a6494-d0fc-4ab3-9d90-1d60f971189e-kube-api-access-ws5st\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.846492 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.846396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.848835 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.848808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.848982 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.848869 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/791a6494-d0fc-4ab3-9d90-1d60f971189e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.853312 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.853290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws5st\" (UniqueName: \"kubernetes.io/projected/791a6494-d0fc-4ab3-9d90-1d60f971189e-kube-api-access-ws5st\") pod \"opendatahub-operator-controller-manager-99ff97f7d-7lwvx\" (UID: \"791a6494-d0fc-4ab3-9d90-1d60f971189e\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:24.938803 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:24.938721 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:25.063521 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:25.063492 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx"] Apr 20 15:01:25.066946 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:25.066917 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791a6494_d0fc_4ab3_9d90_1d60f971189e.slice/crio-03ded65b72ae0474f4f7170d4392c76e63e01a1e7b517fa0c821ba157d7eb793 WatchSource:0}: Error finding container 03ded65b72ae0474f4f7170d4392c76e63e01a1e7b517fa0c821ba157d7eb793: Status 404 returned error can't find the container with id 03ded65b72ae0474f4f7170d4392c76e63e01a1e7b517fa0c821ba157d7eb793 Apr 20 15:01:25.988799 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:25.988759 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" event={"ID":"791a6494-d0fc-4ab3-9d90-1d60f971189e","Type":"ContainerStarted","Data":"03ded65b72ae0474f4f7170d4392c76e63e01a1e7b517fa0c821ba157d7eb793"} Apr 20 15:01:27.995563 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:27.995520 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" event={"ID":"791a6494-d0fc-4ab3-9d90-1d60f971189e","Type":"ContainerStarted","Data":"bb7fbef4a1f621cc54b4107ad67ddfedb592d6cd265ed4cb06785e742c7a21ae"} Apr 20 15:01:27.996043 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:27.995702 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:28.015495 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:28.015437 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" podStartSLOduration=1.6118144110000001 podStartE2EDuration="4.015418938s" podCreationTimestamp="2026-04-20 15:01:24 +0000 UTC" firstStartedPulling="2026-04-20 15:01:25.068524966 +0000 UTC m=+378.662344007" lastFinishedPulling="2026-04-20 15:01:27.472129491 +0000 UTC m=+381.065948534" observedRunningTime="2026-04-20 15:01:28.013455681 +0000 UTC m=+381.607274744" watchObservedRunningTime="2026-04-20 15:01:28.015418938 +0000 UTC m=+381.609238004" Apr 20 15:01:39.000289 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:39.000259 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-7lwvx" Apr 20 15:01:42.901003 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.900965 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7"] Apr 20 15:01:42.904846 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.904825 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:42.907397 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.907241 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 15:01:42.907397 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.907275 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 15:01:42.908161 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.908064 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 15:01:42.908318 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.908255 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fsm29\"" Apr 20 15:01:42.908485 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.908464 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 15:01:42.915363 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.915333 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7"] Apr 20 15:01:42.979481 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.979446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f02d06d1-9f91-408c-8620-5c9999714030-tmp\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:42.979634 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.979502 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f02d06d1-9f91-408c-8620-5c9999714030-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:42.979634 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:42.979537 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j86g7\" (UniqueName: \"kubernetes.io/projected/f02d06d1-9f91-408c-8620-5c9999714030-kube-api-access-j86g7\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.080753 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.080716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f02d06d1-9f91-408c-8620-5c9999714030-tmp\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.080916 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.080777 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f02d06d1-9f91-408c-8620-5c9999714030-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.080916 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.080803 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j86g7\" (UniqueName: \"kubernetes.io/projected/f02d06d1-9f91-408c-8620-5c9999714030-kube-api-access-j86g7\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.082966 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.082932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f02d06d1-9f91-408c-8620-5c9999714030-tmp\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.083164 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.083146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f02d06d1-9f91-408c-8620-5c9999714030-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.089790 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.089743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j86g7\" (UniqueName: \"kubernetes.io/projected/f02d06d1-9f91-408c-8620-5c9999714030-kube-api-access-j86g7\") pod \"kube-auth-proxy-85d448cc4f-8p9z7\" (UID: \"f02d06d1-9f91-408c-8620-5c9999714030\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.218074 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.218001 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" Apr 20 15:01:43.343393 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:43.343368 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7"] Apr 20 15:01:43.345890 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:43.345859 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02d06d1_9f91_408c_8620_5c9999714030.slice/crio-f7515f357c62992037acabd7c31a8a41cc833184588adac0bbb8aa8983407f77 WatchSource:0}: Error finding container f7515f357c62992037acabd7c31a8a41cc833184588adac0bbb8aa8983407f77: Status 404 returned error can't find the container with id f7515f357c62992037acabd7c31a8a41cc833184588adac0bbb8aa8983407f77 Apr 20 15:01:44.037307 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:44.037257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" event={"ID":"f02d06d1-9f91-408c-8620-5c9999714030","Type":"ContainerStarted","Data":"f7515f357c62992037acabd7c31a8a41cc833184588adac0bbb8aa8983407f77"} Apr 20 15:01:46.630006 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.629967 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c52gz"] Apr 20 15:01:46.631912 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.631896 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:46.633922 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.633906 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 15:01:46.633999 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.633954 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-t8mmr\"" Apr 20 15:01:46.641067 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.641038 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c52gz"] Apr 20 15:01:46.710240 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.710209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:46.710424 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.710295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbdl4\" (UniqueName: \"kubernetes.io/projected/c91ee7df-7568-45f0-81cf-3eceaa62d414-kube-api-access-gbdl4\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:46.811530 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.811488 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbdl4\" (UniqueName: \"kubernetes.io/projected/c91ee7df-7568-45f0-81cf-3eceaa62d414-kube-api-access-gbdl4\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:46.811726 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.811558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:46.811726 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:46.811713 2571 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 15:01:46.811846 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:46.811785 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert podName:c91ee7df-7568-45f0-81cf-3eceaa62d414 nodeName:}" failed. No retries permitted until 2026-04-20 15:01:47.311763733 +0000 UTC m=+400.905582775 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert") pod "odh-model-controller-858dbf95b8-c52gz" (UID: "c91ee7df-7568-45f0-81cf-3eceaa62d414") : secret "odh-model-controller-webhook-cert" not found Apr 20 15:01:46.820196 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:46.820164 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbdl4\" (UniqueName: \"kubernetes.io/projected/c91ee7df-7568-45f0-81cf-3eceaa62d414-kube-api-access-gbdl4\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:47.048210 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.048173 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" event={"ID":"f02d06d1-9f91-408c-8620-5c9999714030","Type":"ContainerStarted","Data":"4ba6cc2e8d75e80ca72ba38046313a7411cf9ce68ae44f50ce49c31d9dfc0016"} Apr 20 15:01:47.066359 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.066315 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-8p9z7" podStartSLOduration=2.20136533 podStartE2EDuration="5.066301534s" podCreationTimestamp="2026-04-20 15:01:42 +0000 UTC" firstStartedPulling="2026-04-20 15:01:43.347514487 +0000 UTC m=+396.941333529" lastFinishedPulling="2026-04-20 15:01:46.212450675 +0000 UTC m=+399.806269733" observedRunningTime="2026-04-20 15:01:47.065554065 +0000 UTC m=+400.659373135" watchObservedRunningTime="2026-04-20 15:01:47.066301534 +0000 UTC m=+400.660120630" Apr 20 15:01:47.315853 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.315761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:47.318354 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.318322 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c91ee7df-7568-45f0-81cf-3eceaa62d414-cert\") pod \"odh-model-controller-858dbf95b8-c52gz\" (UID: \"c91ee7df-7568-45f0-81cf-3eceaa62d414\") " pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:47.546729 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.546678 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:47.668699 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:47.668641 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-c52gz"] Apr 20 15:01:47.671306 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:47.671278 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc91ee7df_7568_45f0_81cf_3eceaa62d414.slice/crio-f4a70266efae0c8bd33953eef0ee63985d423e2e24352feff7efaa1bb6fd8b8c WatchSource:0}: Error finding container f4a70266efae0c8bd33953eef0ee63985d423e2e24352feff7efaa1bb6fd8b8c: Status 404 returned error can't find the container with id f4a70266efae0c8bd33953eef0ee63985d423e2e24352feff7efaa1bb6fd8b8c Apr 20 15:01:48.052040 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:48.052007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" event={"ID":"c91ee7df-7568-45f0-81cf-3eceaa62d414","Type":"ContainerStarted","Data":"f4a70266efae0c8bd33953eef0ee63985d423e2e24352feff7efaa1bb6fd8b8c"} Apr 20 15:01:51.064254 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:51.064219 2571 generic.go:358] "Generic (PLEG): container finished" podID="c91ee7df-7568-45f0-81cf-3eceaa62d414" containerID="ffaf24575cb127a21e119365946576c7a7467264ec311dd779650a804b8121bc" exitCode=1 Apr 20 15:01:51.064562 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:51.064299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" event={"ID":"c91ee7df-7568-45f0-81cf-3eceaa62d414","Type":"ContainerDied","Data":"ffaf24575cb127a21e119365946576c7a7467264ec311dd779650a804b8121bc"} Apr 20 15:01:51.064562 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:51.064503 2571 scope.go:117] "RemoveContainer" containerID="ffaf24575cb127a21e119365946576c7a7467264ec311dd779650a804b8121bc" Apr 20 15:01:52.068744 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.068702 2571 generic.go:358] "Generic (PLEG): container finished" podID="c91ee7df-7568-45f0-81cf-3eceaa62d414" containerID="e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9" exitCode=1 Apr 20 15:01:52.069135 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.068751 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" event={"ID":"c91ee7df-7568-45f0-81cf-3eceaa62d414","Type":"ContainerDied","Data":"e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9"} Apr 20 15:01:52.069135 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.068790 2571 scope.go:117] "RemoveContainer" containerID="ffaf24575cb127a21e119365946576c7a7467264ec311dd779650a804b8121bc" Apr 20 15:01:52.069135 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.069045 2571 scope.go:117] "RemoveContainer" containerID="e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9" Apr 20 15:01:52.069263 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:52.069218 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c52gz_opendatahub(c91ee7df-7568-45f0-81cf-3eceaa62d414)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" podUID="c91ee7df-7568-45f0-81cf-3eceaa62d414" Apr 20 15:01:52.730128 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.730092 2571 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2w8jd"] Apr 20 15:01:52.732843 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.732822 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:52.735090 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.735068 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 15:01:52.735214 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.735107 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-4lgjg\"" Apr 20 15:01:52.744974 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.744945 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2w8jd"] Apr 20 15:01:52.860073 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.860040 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhf6w\" (UniqueName: \"kubernetes.io/projected/72f95425-c5fd-4adb-9d68-fb054a4682e8-kube-api-access-dhf6w\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:52.860214 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.860122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:52.960980 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.960944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:52.961160 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.961014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhf6w\" (UniqueName: \"kubernetes.io/projected/72f95425-c5fd-4adb-9d68-fb054a4682e8-kube-api-access-dhf6w\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:52.961160 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:52.961103 2571 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 15:01:52.961243 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:52.961187 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert podName:72f95425-c5fd-4adb-9d68-fb054a4682e8 nodeName:}" failed. No retries permitted until 2026-04-20 15:01:53.461166925 +0000 UTC m=+407.054985965 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert") pod "kserve-controller-manager-856948b99f-2w8jd" (UID: "72f95425-c5fd-4adb-9d68-fb054a4682e8") : secret "kserve-webhook-server-cert" not found Apr 20 15:01:52.969267 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:52.969243 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhf6w\" (UniqueName: \"kubernetes.io/projected/72f95425-c5fd-4adb-9d68-fb054a4682e8-kube-api-access-dhf6w\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:53.072937 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:53.072913 2571 scope.go:117] "RemoveContainer" containerID="e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9" Apr 20 15:01:53.073287 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:53.073076 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c52gz_opendatahub(c91ee7df-7568-45f0-81cf-3eceaa62d414)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" podUID="c91ee7df-7568-45f0-81cf-3eceaa62d414" Apr 20 15:01:53.465284 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:53.465196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:53.467515 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:53.467492 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72f95425-c5fd-4adb-9d68-fb054a4682e8-cert\") pod \"kserve-controller-manager-856948b99f-2w8jd\" (UID: \"72f95425-c5fd-4adb-9d68-fb054a4682e8\") " pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:53.643772 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:53.643726 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:53.759020 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:53.758987 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-2w8jd"] Apr 20 15:01:53.762174 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:01:53.762144 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f95425_c5fd_4adb_9d68_fb054a4682e8.slice/crio-a4ea803338e20fb8a4a6ef2d44d1678755d240126805cbcd7dd54eb20d885fd1 WatchSource:0}: Error finding container a4ea803338e20fb8a4a6ef2d44d1678755d240126805cbcd7dd54eb20d885fd1: Status 404 returned error can't find the container with id a4ea803338e20fb8a4a6ef2d44d1678755d240126805cbcd7dd54eb20d885fd1 Apr 20 15:01:54.075852 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:54.075821 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" event={"ID":"72f95425-c5fd-4adb-9d68-fb054a4682e8","Type":"ContainerStarted","Data":"a4ea803338e20fb8a4a6ef2d44d1678755d240126805cbcd7dd54eb20d885fd1"} Apr 20 15:01:57.086921 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:57.086882 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" event={"ID":"72f95425-c5fd-4adb-9d68-fb054a4682e8","Type":"ContainerStarted","Data":"a6b05c69fa42f98f7c3dd1fa2ae701dd49349dd4f734315132ea042cde33cf0c"} Apr 20 15:01:57.087345 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:57.087010 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:01:57.103845 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:57.103803 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" podStartSLOduration=2.004264564 podStartE2EDuration="5.103791074s" podCreationTimestamp="2026-04-20 15:01:52 +0000 UTC" firstStartedPulling="2026-04-20 15:01:53.763461544 +0000 UTC m=+407.357280585" lastFinishedPulling="2026-04-20 15:01:56.862988045 +0000 UTC m=+410.456807095" observedRunningTime="2026-04-20 15:01:57.102716953 +0000 UTC m=+410.696536018" watchObservedRunningTime="2026-04-20 15:01:57.103791074 +0000 UTC m=+410.697610135" Apr 20 15:01:57.547160 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:57.547123 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:01:57.547519 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:01:57.547505 2571 scope.go:117] "RemoveContainer" containerID="e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9" Apr 20 15:01:57.547724 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:01:57.547706 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-c52gz_opendatahub(c91ee7df-7568-45f0-81cf-3eceaa62d414)\"" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" podUID="c91ee7df-7568-45f0-81cf-3eceaa62d414" Apr 20 15:02:00.886470 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.886437 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-b29dj"] Apr 20 15:02:00.892732 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.892711 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:00.895379 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.895355 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 15:02:00.895508 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.895362 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-5jhs7\"" Apr 20 15:02:00.895662 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.895646 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 15:02:00.904391 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:00.904367 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-b29dj"] Apr 20 15:02:01.027867 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.027838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d72d424a-640b-4e39-818c-142bb612df3a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.028029 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.027907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gzh\" (UniqueName: \"kubernetes.io/projected/d72d424a-640b-4e39-818c-142bb612df3a-kube-api-access-c7gzh\") pod \"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.128506 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.128461 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gzh\" (UniqueName: \"kubernetes.io/projected/d72d424a-640b-4e39-818c-142bb612df3a-kube-api-access-c7gzh\") pod \"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.128729 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.128524 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d72d424a-640b-4e39-818c-142bb612df3a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.136045 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.136011 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d72d424a-640b-4e39-818c-142bb612df3a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.147206 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.147148 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gzh\" (UniqueName: \"kubernetes.io/projected/d72d424a-640b-4e39-818c-142bb612df3a-kube-api-access-c7gzh\") pod 
\"servicemesh-operator3-55f49c5f94-b29dj\" (UID: \"d72d424a-640b-4e39-818c-142bb612df3a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.202234 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.202205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:01.322868 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:01.322826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-b29dj"] Apr 20 15:02:01.325862 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:02:01.325833 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd72d424a_640b_4e39_818c_142bb612df3a.slice/crio-80c297be0ad21a26fd17fc9335cee4383611e330f4bebe36255b87a44754e60e WatchSource:0}: Error finding container 80c297be0ad21a26fd17fc9335cee4383611e330f4bebe36255b87a44754e60e: Status 404 returned error can't find the container with id 80c297be0ad21a26fd17fc9335cee4383611e330f4bebe36255b87a44754e60e Apr 20 15:02:02.102931 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:02.102893 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" event={"ID":"d72d424a-640b-4e39-818c-142bb612df3a","Type":"ContainerStarted","Data":"80c297be0ad21a26fd17fc9335cee4383611e330f4bebe36255b87a44754e60e"} Apr 20 15:02:04.113608 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:04.112847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" event={"ID":"d72d424a-640b-4e39-818c-142bb612df3a","Type":"ContainerStarted","Data":"46682a666cec09d8f759023471c0a59abf329ee4e82fdb4e6d1ea54bc5908894"} Apr 20 15:02:04.113608 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:04.113499 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:04.138789 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:04.138720 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" podStartSLOduration=1.522230159 podStartE2EDuration="4.138701518s" podCreationTimestamp="2026-04-20 15:02:00 +0000 UTC" firstStartedPulling="2026-04-20 15:02:01.328413641 +0000 UTC m=+414.922232681" lastFinishedPulling="2026-04-20 15:02:03.944884987 +0000 UTC m=+417.538704040" observedRunningTime="2026-04-20 15:02:04.138437309 +0000 UTC m=+417.732256372" watchObservedRunningTime="2026-04-20 15:02:04.138701518 +0000 UTC m=+417.732520582" Apr 20 15:02:07.547756 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:07.547712 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:02:07.548230 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:07.548111 2571 scope.go:117] "RemoveContainer" containerID="e304d0563298ad8f0f52157845603210031a836028bac19c55acdf8c1520e5e9" Apr 20 15:02:08.127124 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:08.127041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" event={"ID":"c91ee7df-7568-45f0-81cf-3eceaa62d414","Type":"ContainerStarted","Data":"0ec7c7f6e181b7467992fe9ab8aac11cb2fffe2192c1c9446052ce89f625a63c"} Apr 20 15:02:08.127297 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:08.127275 
2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:02:08.145535 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:08.145468 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" podStartSLOduration=2.017283344 podStartE2EDuration="22.145454095s" podCreationTimestamp="2026-04-20 15:01:46 +0000 UTC" firstStartedPulling="2026-04-20 15:01:47.672620664 +0000 UTC m=+401.266439706" lastFinishedPulling="2026-04-20 15:02:07.800791401 +0000 UTC m=+421.394610457" observedRunningTime="2026-04-20 15:02:08.144626022 +0000 UTC m=+421.738445086" watchObservedRunningTime="2026-04-20 15:02:08.145454095 +0000 UTC m=+421.739273159" Apr 20 15:02:16.122040 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:16.122006 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-b29dj" Apr 20 15:02:19.132444 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:19.132413 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-c52gz" Apr 20 15:02:28.096647 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:28.096569 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-2w8jd" Apr 20 15:02:31.626535 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.626497 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg"] Apr 20 15:02:31.628904 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.628888 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.631096 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.631071 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 15:02:31.631096 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.631090 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 15:02:31.631308 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.631079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-svs86\"" Apr 20 15:02:31.631308 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.631072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 15:02:31.631410 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.631390 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:02:31.638613 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.638587 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg"] Apr 20 15:02:31.648840 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.648817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvjr\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-kube-api-access-lnvjr\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 
20 15:02:31.648951 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.648850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.648951 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.648871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.648951 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.648927 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.649064 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.649005 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.649064 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.649058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.649132 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.649085 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.749865 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: 
\"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvjr\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-kube-api-access-lnvjr\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749964 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.749986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.750482 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.750452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.752438 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.752405 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.752438 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.752426 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.752438 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.752433 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.752604 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.752439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.757621 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.757595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.757781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.757764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvjr\" (UniqueName: \"kubernetes.io/projected/cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b-kube-api-access-lnvjr\") pod \"istiod-openshift-gateway-55ff986f96-wxmzg\" (UID: \"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:31.940333 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:31.940248 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:32.066633 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:32.066582 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg"] Apr 20 15:02:32.069594 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:02:32.069559 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdfe2eff_f0b0_4f55_8a5c_ced2cd895e3b.slice/crio-28394f0afe3623440e6299e380d083a0e7285793c72d3897687b555181a5b7ea WatchSource:0}: Error finding container 28394f0afe3623440e6299e380d083a0e7285793c72d3897687b555181a5b7ea: Status 404 returned error can't find the container with id 28394f0afe3623440e6299e380d083a0e7285793c72d3897687b555181a5b7ea Apr 20 15:02:32.203013 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:32.202923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" event={"ID":"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b","Type":"ContainerStarted","Data":"28394f0afe3623440e6299e380d083a0e7285793c72d3897687b555181a5b7ea"} Apr 20 15:02:35.772198 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:35.772154 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:02:35.772445 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:35.772243 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 15:02:36.217523 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:36.217478 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" event={"ID":"cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b","Type":"ContainerStarted","Data":"0c91435050e78a629d38822f9fa091d1d313acb38233dc1d20ffd206435f3a4a"} Apr 20 15:02:36.217818 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:36.217793 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:02:36.219217 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:36.219195 2571 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-wxmzg container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 15:02:36.219311 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:36.219238 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" podUID="cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:02:36.235776 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:36.235730 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" podStartSLOduration=1.53540218 podStartE2EDuration="5.235714608s" podCreationTimestamp="2026-04-20 15:02:31 +0000 UTC" firstStartedPulling="2026-04-20 15:02:32.071594409 +0000 UTC m=+445.665413456" lastFinishedPulling="2026-04-20 15:02:35.771906842 +0000 UTC m=+449.365725884" observedRunningTime="2026-04-20 15:02:36.234306433 +0000 UTC m=+449.828125516" 
watchObservedRunningTime="2026-04-20 15:02:36.235714608 +0000 UTC m=+449.829533672" Apr 20 15:02:37.221734 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:02:37.221700 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-wxmzg" Apr 20 15:03:26.807417 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.807378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-5hh7c"] Apr 20 15:03:26.809433 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.809417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:26.812725 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.812704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-pjdtg\"" Apr 20 15:03:26.813010 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.812993 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:03:26.813537 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.813523 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:03:26.820353 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.820331 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-5hh7c"] Apr 20 15:03:26.959650 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:26.959616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979bq\" (UniqueName: \"kubernetes.io/projected/5fb4b079-9139-44f5-aa44-e44a368ab5f4-kube-api-access-979bq\") pod \"authorino-operator-657f44b778-5hh7c\" (UID: \"5fb4b079-9139-44f5-aa44-e44a368ab5f4\") " pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:27.060089 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:27.060000 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-979bq\" (UniqueName: \"kubernetes.io/projected/5fb4b079-9139-44f5-aa44-e44a368ab5f4-kube-api-access-979bq\") pod \"authorino-operator-657f44b778-5hh7c\" (UID: \"5fb4b079-9139-44f5-aa44-e44a368ab5f4\") " pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:27.068540 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:27.068513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-979bq\" (UniqueName: \"kubernetes.io/projected/5fb4b079-9139-44f5-aa44-e44a368ab5f4-kube-api-access-979bq\") pod \"authorino-operator-657f44b778-5hh7c\" (UID: \"5fb4b079-9139-44f5-aa44-e44a368ab5f4\") " pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:27.119312 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:27.119281 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:27.238942 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:27.238917 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-5hh7c"] Apr 20 15:03:27.241609 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:03:27.241579 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fb4b079_9139_44f5_aa44_e44a368ab5f4.slice/crio-2a8b5ce80fa2d9fa72daa3624a2a73d3ff6cfee5590bc8bb30eb3b14b7ef18b4 WatchSource:0}: Error finding container 2a8b5ce80fa2d9fa72daa3624a2a73d3ff6cfee5590bc8bb30eb3b14b7ef18b4: Status 404 returned error can't find the container with id 2a8b5ce80fa2d9fa72daa3624a2a73d3ff6cfee5590bc8bb30eb3b14b7ef18b4 Apr 20 15:03:27.384138 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:27.384053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" event={"ID":"5fb4b079-9139-44f5-aa44-e44a368ab5f4","Type":"ContainerStarted","Data":"2a8b5ce80fa2d9fa72daa3624a2a73d3ff6cfee5590bc8bb30eb3b14b7ef18b4"} Apr 20 15:03:29.392236 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:29.392153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" event={"ID":"5fb4b079-9139-44f5-aa44-e44a368ab5f4","Type":"ContainerStarted","Data":"9e6161ed6ccf9587fe6a0d66e63d9804545b970111f048fe3a0c6a786ad27bef"} Apr 20 15:03:29.392640 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:29.392259 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:03:29.421723 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:29.421658 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" podStartSLOduration=1.594027481 podStartE2EDuration="3.421644047s" podCreationTimestamp="2026-04-20 15:03:26 +0000 UTC" firstStartedPulling="2026-04-20 15:03:27.243524472 +0000 UTC m=+500.837343517" lastFinishedPulling="2026-04-20 15:03:29.071141034 +0000 UTC m=+502.664960083" observedRunningTime="2026-04-20 15:03:29.419974927 +0000 UTC m=+503.013794014" watchObservedRunningTime="2026-04-20 15:03:29.421644047 +0000 UTC m=+503.015463110" Apr 20 15:03:40.397556 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:03:40.397522 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-5hh7c" Apr 20 15:04:13.607642 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.607602 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:13.610964 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.610910 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.613960 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.613936 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jwvf8\"" Apr 20 15:04:13.614039 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.613968 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 15:04:13.618727 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.618704 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:13.701884 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.701850 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:13.721170 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.721135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49w82\" (UniqueName: \"kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.721330 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.721236 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.822664 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.822611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49w82\" (UniqueName: \"kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.822872 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.822719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.823379 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.823356 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.832303 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:13.832278 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49w82\" (UniqueName: \"kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82\") pod \"limitador-limitador-7d549b5b-qvwb2\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:13.921723 ip-10-0-141-9 kubenswrapper[2571]: I0420 
15:04:13.921606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:14.040659 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:14.040626 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:14.044913 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:04:14.044883 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1346b49_e21c_4426_a95b_1bbc9abbbb42.slice/crio-92483f50d0b6d202f4ca0f6f3441edfac6bfe1b32071c54766f915f6451560a9 WatchSource:0}: Error finding container 92483f50d0b6d202f4ca0f6f3441edfac6bfe1b32071c54766f915f6451560a9: Status 404 returned error can't find the container with id 92483f50d0b6d202f4ca0f6f3441edfac6bfe1b32071c54766f915f6451560a9 Apr 20 15:04:14.531726 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:14.531663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" event={"ID":"d1346b49-e21c-4426-a95b-1bbc9abbbb42","Type":"ContainerStarted","Data":"92483f50d0b6d202f4ca0f6f3441edfac6bfe1b32071c54766f915f6451560a9"} Apr 20 15:04:17.542225 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:17.542189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" event={"ID":"d1346b49-e21c-4426-a95b-1bbc9abbbb42","Type":"ContainerStarted","Data":"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba"} Apr 20 15:04:17.542595 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:17.542313 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:17.560272 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:17.560225 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" podStartSLOduration=1.957278501 podStartE2EDuration="4.56021129s" podCreationTimestamp="2026-04-20 15:04:13 +0000 UTC" firstStartedPulling="2026-04-20 15:04:14.046718484 +0000 UTC m=+547.640537525" lastFinishedPulling="2026-04-20 15:04:16.649651269 +0000 UTC m=+550.243470314" observedRunningTime="2026-04-20 15:04:17.558595388 +0000 UTC m=+551.152414451" watchObservedRunningTime="2026-04-20 15:04:17.56021129 +0000 UTC m=+551.154030352" Apr 20 15:04:28.546521 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:28.546489 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:29.036228 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.036130 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:29.036403 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.036378 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" podUID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" containerName="limitador" containerID="cri-o://11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba" gracePeriod=30 Apr 20 15:04:29.572536 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.572507 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:29.580052 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.580024 2571 generic.go:358] "Generic (PLEG): container finished" podID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" containerID="11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba" exitCode=0 Apr 20 15:04:29.580167 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.580082 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" Apr 20 15:04:29.580167 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.580107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" event={"ID":"d1346b49-e21c-4426-a95b-1bbc9abbbb42","Type":"ContainerDied","Data":"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba"} Apr 20 15:04:29.580167 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.580150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-qvwb2" event={"ID":"d1346b49-e21c-4426-a95b-1bbc9abbbb42","Type":"ContainerDied","Data":"92483f50d0b6d202f4ca0f6f3441edfac6bfe1b32071c54766f915f6451560a9"} Apr 20 15:04:29.580311 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.580171 2571 scope.go:117] "RemoveContainer" containerID="11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba" Apr 20 15:04:29.587092 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.587079 2571 scope.go:117] "RemoveContainer" containerID="11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba" Apr 20 15:04:29.587323 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:04:29.587307 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba\": container with ID starting with 11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba not found: ID does not exist" containerID="11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba" Apr 20 15:04:29.587382 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.587330 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba"} err="failed to get container status \"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba\": rpc error: code = NotFound desc = could not find container \"11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba\": container with ID starting with 11d995aa701fb917ec14aafe144386c3197074cf2f7007e45e9db15bdb4dedba not found: ID does not exist" Apr 20 15:04:29.644725 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.644628 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file\") pod \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " Apr 20 15:04:29.644725 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.644699 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49w82\" (UniqueName: \"kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82\") pod \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\" (UID: \"d1346b49-e21c-4426-a95b-1bbc9abbbb42\") " Apr 20 15:04:29.644999 ip-10-0-141-9 kubenswrapper[2571]: I0420 
15:04:29.644977 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file" (OuterVolumeSpecName: "config-file") pod "d1346b49-e21c-4426-a95b-1bbc9abbbb42" (UID: "d1346b49-e21c-4426-a95b-1bbc9abbbb42"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:04:29.646826 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.646793 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82" (OuterVolumeSpecName: "kube-api-access-49w82") pod "d1346b49-e21c-4426-a95b-1bbc9abbbb42" (UID: "d1346b49-e21c-4426-a95b-1bbc9abbbb42"). InnerVolumeSpecName "kube-api-access-49w82". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:04:29.745325 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.745294 2571 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d1346b49-e21c-4426-a95b-1bbc9abbbb42-config-file\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 15:04:29.745325 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.745320 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49w82\" (UniqueName: \"kubernetes.io/projected/d1346b49-e21c-4426-a95b-1bbc9abbbb42-kube-api-access-49w82\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 15:04:29.900091 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.900021 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:29.903344 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:29.903318 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-qvwb2"] Apr 20 15:04:30.931543 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:30.931500 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" path="/var/lib/kubelet/pods/d1346b49-e21c-4426-a95b-1bbc9abbbb42/volumes" Apr 20 15:04:34.887334 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.887296 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-swdnp"] Apr 20 15:04:34.887728 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.887584 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" containerName="limitador" Apr 20 15:04:34.887728 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.887595 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" containerName="limitador" Apr 20 15:04:34.887728 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.887659 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1346b49-e21c-4426-a95b-1bbc9abbbb42" containerName="limitador" Apr 20 15:04:34.890120 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.890099 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:34.892614 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.892591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-65bbl\"" Apr 20 15:04:34.892715 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.892591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 15:04:34.898391 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.898366 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-swdnp"] Apr 20 15:04:34.985284 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.985249 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ncn\" (UniqueName: \"kubernetes.io/projected/7f824afc-9875-412c-a2f6-539a0944856a-kube-api-access-z8ncn\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:34.985428 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:34.985297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7f824afc-9875-412c-a2f6-539a0944856a-data\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.086214 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.086175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ncn\" (UniqueName: \"kubernetes.io/projected/7f824afc-9875-412c-a2f6-539a0944856a-kube-api-access-z8ncn\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.086414 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.086225 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7f824afc-9875-412c-a2f6-539a0944856a-data\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.086649 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.086627 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7f824afc-9875-412c-a2f6-539a0944856a-data\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.094644 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.094621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ncn\" (UniqueName: \"kubernetes.io/projected/7f824afc-9875-412c-a2f6-539a0944856a-kube-api-access-z8ncn\") pod \"postgres-868db5846d-swdnp\" (UID: \"7f824afc-9875-412c-a2f6-539a0944856a\") " pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.201587 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.201494 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:35.320512 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.320488 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-swdnp"] Apr 20 15:04:35.323190 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:04:35.323159 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f824afc_9875_412c_a2f6_539a0944856a.slice/crio-6f7ecee3ce55c75393290afc193ad59bbc7d78172281bbe2cb97e74215349b9c WatchSource:0}: Error finding container 6f7ecee3ce55c75393290afc193ad59bbc7d78172281bbe2cb97e74215349b9c: Status 404 returned error can't find the container with id 6f7ecee3ce55c75393290afc193ad59bbc7d78172281bbe2cb97e74215349b9c Apr 20 15:04:35.602126 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:35.602090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-swdnp" event={"ID":"7f824afc-9875-412c-a2f6-539a0944856a","Type":"ContainerStarted","Data":"6f7ecee3ce55c75393290afc193ad59bbc7d78172281bbe2cb97e74215349b9c"} Apr 20 15:04:40.621045 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:40.621009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-swdnp" event={"ID":"7f824afc-9875-412c-a2f6-539a0944856a","Type":"ContainerStarted","Data":"c788fa7e96caf379fb2096993190fd9f3b5bb596afce0b43182fc6fba14bc712"} Apr 20 15:04:40.621549 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:40.621146 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:04:40.637771 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:40.637724 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-swdnp" podStartSLOduration=1.7563295399999999 podStartE2EDuration="6.637707829s" podCreationTimestamp="2026-04-20 15:04:34 +0000 UTC" firstStartedPulling="2026-04-20 15:04:35.324475153 +0000 UTC m=+568.918294194" lastFinishedPulling="2026-04-20 15:04:40.205853441 +0000 UTC m=+573.799672483" observedRunningTime="2026-04-20 15:04:40.636033585 +0000 UTC m=+574.229852650" watchObservedRunningTime="2026-04-20 15:04:40.637707829 +0000 UTC m=+574.231526890" Apr 20 15:04:46.651953 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:04:46.651923 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-swdnp" Apr 20 15:05:01.464197 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.464159 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-kjm6t"] Apr 20 15:05:01.467523 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.467504 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" Apr 20 15:05:01.470006 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.469989 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 20 15:05:01.470798 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.470782 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 20 15:05:01.470798 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.470793 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-jxqjw\"" Apr 20 15:05:01.474975 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.474945 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-kjm6t"] Apr 20 15:05:01.593780 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.593741 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zths6\" (UniqueName: \"kubernetes.io/projected/0cf87d6e-65f7-4d31-a762-a7e4876cc261-kube-api-access-zths6\") pod \"keycloak-operator-5c4df598dd-kjm6t\" (UID: \"0cf87d6e-65f7-4d31-a762-a7e4876cc261\") " pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" Apr 20 15:05:01.694872 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.694839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zths6\" (UniqueName: \"kubernetes.io/projected/0cf87d6e-65f7-4d31-a762-a7e4876cc261-kube-api-access-zths6\") pod \"keycloak-operator-5c4df598dd-kjm6t\" (UID: \"0cf87d6e-65f7-4d31-a762-a7e4876cc261\") " pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" Apr 20 15:05:01.705148 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.705120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zths6\" (UniqueName: \"kubernetes.io/projected/0cf87d6e-65f7-4d31-a762-a7e4876cc261-kube-api-access-zths6\") pod \"keycloak-operator-5c4df598dd-kjm6t\" (UID: \"0cf87d6e-65f7-4d31-a762-a7e4876cc261\") " pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" Apr 20 15:05:01.778224 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.778143 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" Apr 20 15:05:01.895907 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:01.895874 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-kjm6t"] Apr 20 15:05:01.899182 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:05:01.899152 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cf87d6e_65f7_4d31_a762_a7e4876cc261.slice/crio-65c72ddccc6fb810c3e2d1c51d6683adb7633743d81c5bf48de24bfd9ee9978e WatchSource:0}: Error finding container 65c72ddccc6fb810c3e2d1c51d6683adb7633743d81c5bf48de24bfd9ee9978e: Status 404 returned error can't find the container with id 65c72ddccc6fb810c3e2d1c51d6683adb7633743d81c5bf48de24bfd9ee9978e Apr 20 15:05:02.689560 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:02.689521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" event={"ID":"0cf87d6e-65f7-4d31-a762-a7e4876cc261","Type":"ContainerStarted","Data":"65c72ddccc6fb810c3e2d1c51d6683adb7633743d81c5bf48de24bfd9ee9978e"} Apr 20 15:05:08.710669 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:08.710581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" event={"ID":"0cf87d6e-65f7-4d31-a762-a7e4876cc261","Type":"ContainerStarted","Data":"99ee9975c93566f36a74e6c9f55300037a89e3f1659daf8d011ae069333a49ad"} Apr 20 15:05:08.726511 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:05:08.726459 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-kjm6t" podStartSLOduration=1.22227286 podStartE2EDuration="7.726442539s" podCreationTimestamp="2026-04-20 15:05:01 +0000 UTC" firstStartedPulling="2026-04-20 15:05:01.900610817 +0000 UTC m=+595.494429858" lastFinishedPulling="2026-04-20 15:05:08.404780492 +0000 UTC m=+601.998599537" observedRunningTime="2026-04-20 15:05:08.725786897 +0000 UTC m=+602.319605954" watchObservedRunningTime="2026-04-20 15:05:08.726442539 +0000 UTC m=+602.320261609" Apr 20 15:15:00.144967 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.144930 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"] Apr 20 15:15:00.147993 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.147975 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" Apr 20 15:15:00.150461 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.150442 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gn9x5\"" Apr 20 15:15:00.164090 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.164068 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"] Apr 20 15:15:00.238147 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.238115 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8j2b\" (UniqueName: \"kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b\") pod \"maas-api-key-cleanup-29611635-472b4\" (UID: \"47eb57c4-3616-455a-b8b1-cc3133f42c3b\") " pod="opendatahub/maas-api-key-cleanup-29611635-472b4" Apr 20 15:15:00.338713 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.338661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8j2b\" (UniqueName: \"kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b\") pod \"maas-api-key-cleanup-29611635-472b4\" (UID: \"47eb57c4-3616-455a-b8b1-cc3133f42c3b\") " pod="opendatahub/maas-api-key-cleanup-29611635-472b4" Apr 20 15:15:00.346792 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.346758 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8j2b\" (UniqueName: \"kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b\") pod \"maas-api-key-cleanup-29611635-472b4\" (UID: \"47eb57c4-3616-455a-b8b1-cc3133f42c3b\") " pod="opendatahub/maas-api-key-cleanup-29611635-472b4" Apr 20 15:15:00.457595 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.457524 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" Apr 20 15:15:00.574752 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.574624 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"] Apr 20 15:15:00.577389 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:15:00.577359 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47eb57c4_3616_455a_b8b1_cc3133f42c3b.slice/crio-aa18cbbf1de096f0b0256465ce0d0ba1085c794a9611ba1c98bb6ae909363922 WatchSource:0}: Error finding container aa18cbbf1de096f0b0256465ce0d0ba1085c794a9611ba1c98bb6ae909363922: Status 404 returned error can't find the container with id aa18cbbf1de096f0b0256465ce0d0ba1085c794a9611ba1c98bb6ae909363922 Apr 20 15:15:00.579149 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:00.579129 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:15:01.540678 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:01.540645 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerStarted","Data":"aa18cbbf1de096f0b0256465ce0d0ba1085c794a9611ba1c98bb6ae909363922"} Apr 20 15:15:03.548326 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:03.548244 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerStarted","Data":"c8a3a6edd974db13f402faca511dcfa4bc579fbc188d06fdbd02e7b9d5a1a02b"} Apr 20 15:15:03.564392 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:03.564342 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" podStartSLOduration=1.18327314 podStartE2EDuration="3.564330674s" podCreationTimestamp="2026-04-20 15:15:00 +0000 UTC" firstStartedPulling="2026-04-20 15:15:00.579314097 +0000 UTC m=+1194.173133150" lastFinishedPulling="2026-04-20 15:15:02.960371644 +0000 UTC m=+1196.554190684" observedRunningTime="2026-04-20 15:15:03.56199382 +0000 UTC m=+1197.155812895" watchObservedRunningTime="2026-04-20 15:15:03.564330674 +0000 UTC m=+1197.158149763" Apr 20 15:15:23.620572 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:23.620542 2571 generic.go:358] "Generic (PLEG): container finished" podID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerID="c8a3a6edd974db13f402faca511dcfa4bc579fbc188d06fdbd02e7b9d5a1a02b" exitCode=6 Apr 20 15:15:23.620921 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:23.620612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerDied","Data":"c8a3a6edd974db13f402faca511dcfa4bc579fbc188d06fdbd02e7b9d5a1a02b"} Apr 20 15:15:23.620993 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:23.620979 2571 scope.go:117] "RemoveContainer" containerID="c8a3a6edd974db13f402faca511dcfa4bc579fbc188d06fdbd02e7b9d5a1a02b" Apr 20 15:15:24.626531 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:24.626493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerStarted","Data":"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"} Apr 20 15:15:44.690621 ip-10-0-141-9 kubenswrapper[2571]: I0420 
Apr 20 15:15:44.690621 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:44.690534 2571 generic.go:358] "Generic (PLEG): container finished" podID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad" exitCode=6
Apr 20 15:15:44.690621 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:44.690606 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerDied","Data":"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"}
Apr 20 15:15:44.691114 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:44.690648 2571 scope.go:117] "RemoveContainer" containerID="c8a3a6edd974db13f402faca511dcfa4bc579fbc188d06fdbd02e7b9d5a1a02b"
Apr 20 15:15:44.691114 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:44.691010 2571 scope.go:117] "RemoveContainer" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"
Apr 20 15:15:44.691259 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:15:44.691232 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611635-472b4_opendatahub(47eb57c4-3616-455a-b8b1-cc3133f42c3b)\"" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b"
Apr 20 15:15:58.928878 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:58.928789 2571 scope.go:117] "RemoveContainer" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"
Apr 20 15:15:59.742212 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:59.742177 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerStarted","Data":"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"}
Apr 20 15:15:59.968072 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:15:59.968037 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"]
Apr 20 15:16:00.745478 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:00.745415 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" containerID="cri-o://955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228" gracePeriod=30
Apr 20 15:16:19.685222 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.685197 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-472b4"
Apr 20 15:16:19.804722 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.804672 2571 generic.go:358] "Generic (PLEG): container finished" podID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerID="955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228" exitCode=6
Apr 20 15:16:19.804871 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.804770 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-472b4"
Apr 20 15:16:19.804871 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.804770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerDied","Data":"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"}
Apr 20 15:16:19.804871 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.804811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-472b4" event={"ID":"47eb57c4-3616-455a-b8b1-cc3133f42c3b","Type":"ContainerDied","Data":"aa18cbbf1de096f0b0256465ce0d0ba1085c794a9611ba1c98bb6ae909363922"}
Apr 20 15:16:19.804871 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.804827 2571 scope.go:117] "RemoveContainer" containerID="955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"
Apr 20 15:16:19.812262 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.812247 2571 scope.go:117] "RemoveContainer" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"
Apr 20 15:16:19.819128 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.819112 2571 scope.go:117] "RemoveContainer" containerID="955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"
Apr 20 15:16:19.819341 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:16:19.819321 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228\": container with ID starting with 955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228 not found: ID does not exist" containerID="955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"
Apr 20 15:16:19.819401 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.819354 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228"} err="failed to get container status \"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228\": rpc error: code = NotFound desc = could not find container \"955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228\": container with ID starting with 955241d9be8ece5a6ddb5f4181514a65c3fc0f0ba35c7a1dca6b67c8618c6228 not found: ID does not exist"
Apr 20 15:16:19.819401 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.819377 2571 scope.go:117] "RemoveContainer" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"
Apr 20 15:16:19.819593 ip-10-0-141-9 kubenswrapper[2571]: E0420 15:16:19.819575 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad\": container with ID starting with 2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad not found: ID does not exist" containerID="2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"
Apr 20 15:16:19.819633 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.819600 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad"} err="failed to get container status \"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad\": rpc error: code = NotFound desc = could not find container \"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad\": container with ID starting with 2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad not found: ID does not exist"
\"2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad\": container with ID starting with 2548ed400942f896ccf74bfd8338bc4f6289334fe03015230d877629ff66fcad not found: ID does not exist" Apr 20 15:16:19.846027 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.846007 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8j2b\" (UniqueName: \"kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b\") pod \"47eb57c4-3616-455a-b8b1-cc3133f42c3b\" (UID: \"47eb57c4-3616-455a-b8b1-cc3133f42c3b\") " Apr 20 15:16:19.848055 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.848023 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b" (OuterVolumeSpecName: "kube-api-access-f8j2b") pod "47eb57c4-3616-455a-b8b1-cc3133f42c3b" (UID: "47eb57c4-3616-455a-b8b1-cc3133f42c3b"). InnerVolumeSpecName "kube-api-access-f8j2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:16:19.946763 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:19.946723 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8j2b\" (UniqueName: \"kubernetes.io/projected/47eb57c4-3616-455a-b8b1-cc3133f42c3b-kube-api-access-f8j2b\") on node \"ip-10-0-141-9.ec2.internal\" DevicePath \"\"" Apr 20 15:16:20.125604 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:20.125576 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"] Apr 20 15:16:20.127502 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:20.127480 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-472b4"] Apr 20 15:16:20.932519 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:16:20.932483 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" path="/var/lib/kubelet/pods/47eb57c4-3616-455a-b8b1-cc3133f42c3b/volumes" Apr 20 15:29:17.555586 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:17.555553 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2w8jd_72f95425-c5fd-4adb-9d68-fb054a4682e8/manager/0.log" Apr 20 15:29:17.929043 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:17.928915 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c52gz_c91ee7df-7568-45f0-81cf-3eceaa62d414/manager/2.log" Apr 20 15:29:18.051084 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:18.051023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-7lwvx_791a6494-d0fc-4ab3-9d90-1d60f971189e/manager/0.log" Apr 20 15:29:18.388295 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:18.388263 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-swdnp_7f824afc-9875-412c-a2f6-539a0944856a/postgres/0.log" Apr 20 15:29:19.777747 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:19.777718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-5hh7c_5fb4b079-9139-44f5-aa44-e44a368ab5f4/manager/0.log" Apr 20 15:29:20.902464 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:20.902430 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wxmzg_cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b/discovery/0.log" Apr 20 
15:29:21.011787 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:21.011755 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-85d448cc4f-8p9z7_f02d06d1-9f91-408c-8620-5c9999714030/kube-auth-proxy/0.log" Apr 20 15:29:25.961310 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961231 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfh8/must-gather-6wtz9"] Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961531 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961542 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961552 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961559 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961643 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961650 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961657 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961719 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.961781 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.961725 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="47eb57c4-3616-455a-b8b1-cc3133f42c3b" containerName="cleanup" Apr 20 15:29:25.964447 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.964427 2571 util.go:30] "No sandbox for pod can be found. 
Apr 20 15:29:25.966737 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.966712 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8mfh8\"/\"default-dockercfg-xm2g5\""
Apr 20 15:29:25.966857 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.966808 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"openshift-service-ca.crt\""
Apr 20 15:29:25.967656 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.967640 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"kube-root-ca.crt\""
Apr 20 15:29:25.977926 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:25.977905 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/must-gather-6wtz9"]
Apr 20 15:29:26.014192 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.014166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4hl\" (UniqueName: \"kubernetes.io/projected/42775810-4273-4069-8253-b050e94f10d2-kube-api-access-vb4hl\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.014324 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.014236 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42775810-4273-4069-8253-b050e94f10d2-must-gather-output\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.114598 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.114562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4hl\" (UniqueName: \"kubernetes.io/projected/42775810-4273-4069-8253-b050e94f10d2-kube-api-access-vb4hl\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.114797 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.114651 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42775810-4273-4069-8253-b050e94f10d2-must-gather-output\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.114990 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.114968 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/42775810-4273-4069-8253-b050e94f10d2-must-gather-output\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.122603 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.122574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4hl\" (UniqueName: \"kubernetes.io/projected/42775810-4273-4069-8253-b050e94f10d2-kube-api-access-vb4hl\") pod \"must-gather-6wtz9\" (UID: \"42775810-4273-4069-8253-b050e94f10d2\") " pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.273765 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.273641 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/must-gather-6wtz9"
Apr 20 15:29:26.404794 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.404768 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/must-gather-6wtz9"]
Apr 20 15:29:26.407468 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:29:26.407434 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42775810_4273_4069_8253_b050e94f10d2.slice/crio-15d242f7da96cbff9fec5cd8433e3105306be4f096b6c2e986b779243974ab56 WatchSource:0}: Error finding container 15d242f7da96cbff9fec5cd8433e3105306be4f096b6c2e986b779243974ab56: Status 404 returned error can't find the container with id 15d242f7da96cbff9fec5cd8433e3105306be4f096b6c2e986b779243974ab56
Apr 20 15:29:26.409315 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:26.409291 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:29:27.216316 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:27.216278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/must-gather-6wtz9" event={"ID":"42775810-4273-4069-8253-b050e94f10d2","Type":"ContainerStarted","Data":"15d242f7da96cbff9fec5cd8433e3105306be4f096b6c2e986b779243974ab56"}
Apr 20 15:29:28.221878 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:28.221836 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/must-gather-6wtz9" event={"ID":"42775810-4273-4069-8253-b050e94f10d2","Type":"ContainerStarted","Data":"2dbd1483dc9942f6d6647f7cb136fd394292f20822e8b7579bbc1c39f8aff72b"}
Apr 20 15:29:28.221878 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:28.221885 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/must-gather-6wtz9" event={"ID":"42775810-4273-4069-8253-b050e94f10d2","Type":"ContainerStarted","Data":"a133765d13f79851962f2fcde58ebd635865ea98ac8e7b63195e9390156b8b60"}
Apr 20 15:29:28.237824 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:28.237778 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfh8/must-gather-6wtz9" podStartSLOduration=2.211260064 podStartE2EDuration="3.2377628s" podCreationTimestamp="2026-04-20 15:29:25 +0000 UTC" firstStartedPulling="2026-04-20 15:29:26.409432687 +0000 UTC m=+2060.003251727" lastFinishedPulling="2026-04-20 15:29:27.435935419 +0000 UTC m=+2061.029754463" observedRunningTime="2026-04-20 15:29:28.236542183 +0000 UTC m=+2061.830361248" watchObservedRunningTime="2026-04-20 15:29:28.2377628 +0000 UTC m=+2061.831581862"
Apr 20 15:29:29.049104 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:29.049063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gd648_78ea8d8a-7389-4822-92b4-41f9e8b474b9/global-pull-secret-syncer/0.log"
Apr 20 15:29:29.155039 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:29.155006 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v5dst_0827ffc7-2165-4812-b9cf-29976d74ffc2/konnectivity-agent/0.log"
Apr 20 15:29:29.217678 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:29.217649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-9.ec2.internal_e193ebdeb87d4b9c6bb9f329d9d23d3d/haproxy/0.log"
Apr 20 15:29:33.546469 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:33.546439 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-5hh7c_5fb4b079-9139-44f5-aa44-e44a368ab5f4/manager/0.log"
path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-5hh7c_5fb4b079-9139-44f5-aa44-e44a368ab5f4/manager/0.log" Apr 20 15:29:35.545566 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:35.545532 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzs5m_1d6abef2-4fbf-4446-985e-849ae60e2a9b/node-exporter/0.log" Apr 20 15:29:35.567771 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:35.567744 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzs5m_1d6abef2-4fbf-4446-985e-849ae60e2a9b/kube-rbac-proxy/0.log" Apr 20 15:29:35.591306 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:35.591276 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qzs5m_1d6abef2-4fbf-4446-985e-849ae60e2a9b/init-textfile/0.log" Apr 20 15:29:37.855821 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.855782 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"] Apr 20 15:29:37.861166 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.861134 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:37.867947 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.867914 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"] Apr 20 15:29:37.918248 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.918217 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98qsp\" (UniqueName: \"kubernetes.io/projected/b20ed830-ce70-4fb2-a2f2-de80be8c9483-kube-api-access-98qsp\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:37.918434 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.918270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-sys\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:37.918434 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.918324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-lib-modules\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:37.918434 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.918392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-podres\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:37.918434 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:37.918428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-proc\") pod 
\"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019717 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-lib-modules\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019887 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-podres\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019887 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-proc\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019887 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98qsp\" (UniqueName: \"kubernetes.io/projected/b20ed830-ce70-4fb2-a2f2-de80be8c9483-kube-api-access-98qsp\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019887 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-sys\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.019887 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-lib-modules\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.020140 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019886 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-proc\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.020140 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-sys\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:38.020140 ip-10-0-141-9 kubenswrapper[2571]: I0420 
Apr 20 15:29:38.020140 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.019885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b20ed830-ce70-4fb2-a2f2-de80be8c9483-podres\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"
Apr 20 15:29:38.028743 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.028714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98qsp\" (UniqueName: \"kubernetes.io/projected/b20ed830-ce70-4fb2-a2f2-de80be8c9483-kube-api-access-98qsp\") pod \"perf-node-gather-daemonset-ckrc8\" (UID: \"b20ed830-ce70-4fb2-a2f2-de80be8c9483\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"
Apr 20 15:29:38.176784 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.176310 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"
Apr 20 15:29:38.336231 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:38.334262 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"]
Apr 20 15:29:38.337423 ip-10-0-141-9 kubenswrapper[2571]: W0420 15:29:38.337382 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb20ed830_ce70_4fb2_a2f2_de80be8c9483.slice/crio-57f3106ee9746c19bccfbf0e0e69e923b8d9274b4a1419e8221a81352bbefd96 WatchSource:0}: Error finding container 57f3106ee9746c19bccfbf0e0e69e923b8d9274b4a1419e8221a81352bbefd96: Status 404 returned error can't find the container with id 57f3106ee9746c19bccfbf0e0e69e923b8d9274b4a1419e8221a81352bbefd96
Apr 20 15:29:39.296833 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.296800 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" event={"ID":"b20ed830-ce70-4fb2-a2f2-de80be8c9483","Type":"ContainerStarted","Data":"e4e1a9e3ad40f2614ace8c00a4cb23f5dc436d52a653f2344f247febfc032a35"}
Apr 20 15:29:39.297407 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.297352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" event={"ID":"b20ed830-ce70-4fb2-a2f2-de80be8c9483","Type":"ContainerStarted","Data":"57f3106ee9746c19bccfbf0e0e69e923b8d9274b4a1419e8221a81352bbefd96"}
Apr 20 15:29:39.298478 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.298442 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"
Apr 20 15:29:39.314363 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.314313 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" podStartSLOduration=2.31429815 podStartE2EDuration="2.31429815s" podCreationTimestamp="2026-04-20 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:29:39.312779702 +0000 UTC m=+2072.906598767" watchObservedRunningTime="2026-04-20 15:29:39.31429815 +0000 UTC m=+2072.908117221"
Apr 20 15:29:39.534215 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.534187 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8vlv5_f51becfe-6707-48a0-8930-b1feea33fb21/dns/0.log"
Apr 20 15:29:39.554500 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.554400 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8vlv5_f51becfe-6707-48a0-8930-b1feea33fb21/kube-rbac-proxy/0.log"
Apr 20 15:29:39.659412 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:39.659386 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-58fm4_b535020a-3ebe-44bb-8180-63bb281aceff/dns-node-resolver/0.log"
Apr 20 15:29:40.144867 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:40.144837 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gk5rl_7be4d4a0-b5c2-4857-a3ae-245ad4430c7c/node-ca/0.log"
Apr 20 15:29:41.149629 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:41.149596 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-wxmzg_cdfe2eff-f0b0-4f55-8a5c-ced2cd895e3b/discovery/0.log"
Apr 20 15:29:41.170049 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:41.170025 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-85d448cc4f-8p9z7_f02d06d1-9f91-408c-8620-5c9999714030/kube-auth-proxy/0.log"
Apr 20 15:29:41.869870 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:41.869824 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ktxxq_e0053f0d-ea66-4e0b-950d-23c42c995f23/serve-healthcheck-canary/0.log"
Apr 20 15:29:42.291731 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:42.291635 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fsfq2_b19cde52-4a17-4909-9fca-1ee609bd3a49/kube-rbac-proxy/0.log"
Apr 20 15:29:42.310383 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:42.310356 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fsfq2_b19cde52-4a17-4909-9fca-1ee609bd3a49/exporter/0.log"
Apr 20 15:29:42.329430 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:42.329405 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fsfq2_b19cde52-4a17-4909-9fca-1ee609bd3a49/extractor/0.log"
Apr 20 15:29:44.390864 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:44.390830 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-2w8jd_72f95425-c5fd-4adb-9d68-fb054a4682e8/manager/0.log"
Apr 20 15:29:44.493471 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:44.493439 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c52gz_c91ee7df-7568-45f0-81cf-3eceaa62d414/manager/1.log"
Apr 20 15:29:44.516590 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:44.516549 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-c52gz_c91ee7df-7568-45f0-81cf-3eceaa62d414/manager/2.log"
Apr 20 15:29:44.573302 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:44.573275 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-7lwvx_791a6494-d0fc-4ab3-9d90-1d60f971189e/manager/0.log"
Apr 20 15:29:44.641926 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:44.641850 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-swdnp_7f824afc-9875-412c-a2f6-539a0944856a/postgres/0.log"
Apr 20 15:29:46.314660 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:46.314634 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8"
probe="readiness" status="ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-ckrc8" Apr 20 15:29:51.604749 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.604723 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/kube-multus-additional-cni-plugins/0.log" Apr 20 15:29:51.623397 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.623364 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/egress-router-binary-copy/0.log" Apr 20 15:29:51.642561 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.642530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/cni-plugins/0.log" Apr 20 15:29:51.661452 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.661430 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/bond-cni-plugin/0.log" Apr 20 15:29:51.681030 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.681006 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/routeoverride-cni/0.log" Apr 20 15:29:51.699994 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.699968 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/whereabouts-cni-bincopy/0.log" Apr 20 15:29:51.716790 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.716761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kfbrh_2e10015e-64f6-4b90-b27b-5d53c810c05d/whereabouts-cni/0.log" Apr 20 15:29:51.926165 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.926086 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t2gsq_026bd687-3320-46f1-b7ea-f615e5b5a821/kube-multus/0.log" Apr 20 15:29:51.972050 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.972017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7v79q_d1dafe36-2ae8-4593-82df-fbff4eee87b1/network-metrics-daemon/0.log" Apr 20 15:29:51.989810 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:51.989759 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7v79q_d1dafe36-2ae8-4593-82df-fbff4eee87b1/kube-rbac-proxy/0.log" Apr 20 15:29:53.370148 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.370119 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/ovn-controller/0.log" Apr 20 15:29:53.407516 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.407480 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/ovn-acl-logging/0.log" Apr 20 15:29:53.426589 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.426558 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/kube-rbac-proxy-node/0.log" Apr 20 15:29:53.447348 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.447314 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 15:29:53.463427 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.463401 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/northd/0.log" Apr 20 15:29:53.483328 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.483301 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/nbdb/0.log" Apr 20 15:29:53.504587 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.504558 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/sbdb/0.log" Apr 20 15:29:53.680633 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:53.680564 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qpw9h_febe8a99-bbc0-4ad0-9eb4-512c729e11c3/ovnkube-controller/0.log" Apr 20 15:29:54.757289 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:54.757254 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-x9fss_cb271ee0-fe50-4ec5-a58b-e4cde09671b7/network-check-target-container/0.log" Apr 20 15:29:55.814995 ip-10-0-141-9 kubenswrapper[2571]: I0420 15:29:55.814964 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dqx88_2b9613e1-9a91-40ed-9ec1-4fcfa4ec06af/iptables-alerter/0.log"