Apr 16 14:52:09.511514 ip-10-0-130-229 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:52:09.968391 ip-10-0-130-229 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:09.968391 ip-10-0-130-229 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:52:09.968391 ip-10-0-130-229 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:09.968391 ip-10-0-130-229 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:52:09.968391 ip-10-0-130-229 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:52:09.969086 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.969018    2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:52:09.973677 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973663    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:09.973677 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973677    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973681    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973684    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973687    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973690    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973692    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973695    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973697    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973700    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973704    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973706    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973709    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973712    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973714    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973722    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973725    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973728    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973731    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973733    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973736    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:09.973736 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973739    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973741    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973744    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973747    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973750    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973753    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973755    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973758    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973760    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973763    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973765    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973768    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973770    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973773    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973775    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973778    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973780    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973782    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973785    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:09.974196 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973787    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973790    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973792    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973795    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973797    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973800    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973802    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973806    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973809    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973813    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973815    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973818    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973820    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973823    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973826    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973828    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973831    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973834    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973836    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973839    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:09.974662 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973841    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973844    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973846    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973849    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973852    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973855    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973857    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973860    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973862    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973865    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973867    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973870    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973874    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973878    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973881    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973883    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973886    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973888    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973891    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973893    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:09.975128 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973896    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973899    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973901    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973904    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973908    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.973911    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974274    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974279    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974284    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974287    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974290    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974293    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974296    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974298    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974301    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974304    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974306    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974309    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974311    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:09.975599 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974314    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974317    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974319    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974322    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974324    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974326    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974329    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974332    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974334    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974337    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974339    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974341    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974344    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974348    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974351    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974354    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974356    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974359    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974361    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974364    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:09.976054 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974367    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974369    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974372    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974374    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974378    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974381    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974384    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974387    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974389    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974392    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974395    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974398    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974401    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974404    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974406    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974409    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974411    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974415    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974417    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:09.976546 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974419    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974422    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974424    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974427    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974429    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974432    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974435    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974437    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974440    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974442    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974445    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974447    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974451    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974453    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974455    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974458    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974460    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974463    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974465    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974468    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:09.976999 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974470    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974472    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974475    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974478    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974480    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974482    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974485    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974488    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974490    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974493    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974495    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974497    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974500    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.974502    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974569    2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974576    2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974582    2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974589    2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974593    2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974596    2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974600    2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:52:09.977510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974605    2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974608    2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974611    2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974615    2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974619    2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974622    2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974625    2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974628    2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974631    2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974633    2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974636    2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974639    2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974643    2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974646    2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974649    2575 flags.go:64] FLAG: --config-dir=""
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974652    2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974655    2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974659    2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974663    2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974666    2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974669    2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974673    2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974676    2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974679    2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974683    2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:52:09.978001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974686    2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974690    2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974693    2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974696    2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974699    2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974702    2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974705    2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974709    2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974713    2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974716    2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974719    2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974722    2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974726    2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974729    2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974732    2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974735    2575 flags.go:64] FLAG: --eviction-soft=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974738    2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974740    2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974743    2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974746    2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974749    2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974752    2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974754    2575 flags.go:64] FLAG: --feature-gates=""
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974758    2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974761    2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 14:52:09.978629 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974764    2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974767    2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974771    2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974774    2575 flags.go:64] FLAG: --help="false"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974777    2575 flags.go:64] FLAG: --hostname-override="ip-10-0-130-229.ec2.internal"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974780    2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974783    2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974786    2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974789    2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974792    2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974795    2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974798    2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974801    2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974803    2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974806    2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974809    2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974812    2575 flags.go:64] FLAG: --kube-reserved=""
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974817    2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974820    2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974823    2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974826    2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974829    2575 flags.go:64] FLAG: --lock-file=""
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974832    2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974835    2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 14:52:09.979195 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974837    2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974843    2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974845    2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974848    2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974851    2575 flags.go:64] FLAG: --logging-format="text"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974854    2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974857    2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974860    2575 flags.go:64] FLAG: --manifest-url=""
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974864    2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974868    2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974871    2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974875    2575 flags.go:64] FLAG: --max-pods="110"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974879    2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974882    2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974885    2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974888    2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974891    2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974893    2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974896    2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974903    2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974906    2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974909    2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974912    2575 flags.go:64] FLAG: --pod-cidr=""
Apr 16 14:52:09.979773 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974915    2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974921    2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974925    2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974928    2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974932    2575 flags.go:64] FLAG: --port="10250"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974934    2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974937    2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-065d03c90b53b7517"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974940    2575 flags.go:64] FLAG: --qos-reserved=""
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974943    2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974946    2575 flags.go:64] FLAG: --register-node="true"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974949    2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974951    2575 flags.go:64] FLAG: --register-with-taints=""
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974955    2575 flags.go:64] FLAG: --registry-burst="10"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974957    2575 flags.go:64] FLAG: --registry-qps="5"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974960    2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974963    2575 flags.go:64] FLAG: --reserved-memory=""
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974967    2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974970    2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974973    2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974975    2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974979    2575 flags.go:64] FLAG: --runonce="false"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974982    2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974985    2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974988    2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974991    2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974994    2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 14:52:09.980350 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.974997    2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975000    2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975003    2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975006    2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975008    2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975011    2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975014    2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975017    2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975021    2575 flags.go:64] FLAG: --system-cgroups=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975024    2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975029    2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975033    2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975035    2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975040    2575 flags.go:64] FLAG: --tls-min-version=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975043    2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975045    2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975048    2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975051    2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975054    2575 flags.go:64] FLAG: --v="2"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975058    2575 flags.go:64] FLAG: --version="false"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975062    2575 flags.go:64] FLAG: --vmodule=""
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975066    2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975069    2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975153    2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975156    2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:09.980942 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975159    2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975164    2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975167    2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975170    2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975172    2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975175    2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975178    2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975180    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975183    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975186    2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975188    2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975191    2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975193    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975196    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975198    2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975206    2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975223    2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975226    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975229    2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:09.981580 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975232    2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975234    2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975237    2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975239    2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975242    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975244    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975247    2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975249    2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975252    2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975255    2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975257    2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975260    2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975263    2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975265    2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975269    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975271    2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975274    2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975277    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975281    2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:09.982335 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975284    2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975287    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975290    2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975292    2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975295    2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975298    2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975300    2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975303    2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975305    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975309    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975312    2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975315    2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975318    2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975321    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975323    2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975326    2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975328    2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975330    2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975333    2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975335    2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:09.982850 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975338    2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975341    2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975343    2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975346    2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975348    2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975351    2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975353    2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975358    2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975361    2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975363    2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975366    2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975368    2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975371    2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975373    2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975376    2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975378    2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975381    2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975383    2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975386    2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975388    2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:09.983381 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975390    2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975394    2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975397    2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975399    2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975402    2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:09.983892
ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.975405 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.975850 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.983659 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.983675 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983722 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983726 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983729 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983733 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983736 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983739 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:09.983892 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983742 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983744 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983748 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983753 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983756 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983759 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983762 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983765 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983767 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983770 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983773 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983776 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983779 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983782 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983784 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983787 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983790 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983792 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983795 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:09.984267 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983797 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983800 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983803 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983805 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983808 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983810 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983814 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983817 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983819 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983822 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983824 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983827 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983838 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983841 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983844 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983846 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983849 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983852 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983855 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983857 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:09.984716 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983860 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983862 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983865 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983867 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983870 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983873 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983876 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983879 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983881 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983884 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983886 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983889 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983891 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983894 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983896 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983899 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983902 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983904 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983908 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983912 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:09.985193 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983916 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983919 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983922 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983924 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983927 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983930 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983932 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983934 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983937 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983939 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983942 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983945 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983948 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983950 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983953 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983955 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983958 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983960 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983963 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983965 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:09.985674 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.983968 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.983973 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984080 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984086 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984091 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984094 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984097 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984100 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984103 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984105 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984108 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984111 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984114 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984116 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984119 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:52:09.986147 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984121 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984124 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984126 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984129 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984131 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984134 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984136 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984139 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984141 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984145 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984147 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984150 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984152 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984155 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984158 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984160 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984162 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984165 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984167 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984170 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:52:09.986506 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984172 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984174 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984177 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984179 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984182 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984185 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984188 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984190 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984193 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984197 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984200 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984203 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984205 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984226 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984229 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984232 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984235 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984237 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984240 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:52:09.986978 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984242 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984245 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984247 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984251 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984255 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984257 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984260 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984262 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984265 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984267 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984270 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984272 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984275 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984278 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984281 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984283 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984285 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984288 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984291 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984293 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:52:09.987448 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984296 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984298 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984301 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984304 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984306 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984309 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984311 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984314 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984316 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984319 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984321 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984323 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984326 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:09.984328 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.984333 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:52:09.987945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.985110 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:52:09.988358 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.986887 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:52:09.988358 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.987803 2575 server.go:1019] "Starting client certificate rotation"
Apr 16 14:52:09.988358 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.987894 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:09.988358 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:09.987934 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:52:10.010681 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.010665 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:10.013438 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.013418 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:52:10.029776 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.029756 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:52:10.034674 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.034662 2575 log.go:25] "Validated CRI v1 image API"
Apr 16 14:52:10.036222 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.036174 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:52:10.040384 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.040361 2575 fs.go:135] Filesystem UUIDs: map[20b2a3cd-dd44-4c57-94a6-ba49f76c112e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e1390e88-43cf-4e7f-ab28-c065c6849f39:/dev/nvme0n1p3]
Apr 16 14:52:10.040975 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.040956 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:52:10.041624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.041607 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:52:10.046280 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.046159 2575 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:10.044458632 +0000 UTC m=+0.413741114 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199781 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f8bc37c85ead3e9e8782a1dbcf90b SystemUUID:ec2f8bc3-7c85-ead3-e9e8-782a1dbcf90b BootID:1367c910-953f-4800-9b62-8bdfb62cca72 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8a:f3:6c:33:11 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8a:f3:6c:33:11 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:9f:0d:ae:be:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:52:10.046280 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.046275 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:52:10.046389 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.046345 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:52:10.048722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.048695 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:52:10.048892 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.048724 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-229.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 14:52:10.048935 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.048901 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 14:52:10.048935 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.048910 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 14:52:10.048935 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.048928 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:10.049696 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.049686 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 14:52:10.051337 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.051327 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:10.051579 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.051569 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 14:52:10.053615 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.053606 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 14:52:10.053646 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.053621 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 14:52:10.053646 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.053633 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 14:52:10.053646 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.053641 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 14:52:10.053752 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.053649 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 14:52:10.054889 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.054877 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:10.054931 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.054898 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 14:52:10.057477 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.057462 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 14:52:10.059110 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.059093 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 14:52:10.060356 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060344 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060359 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060366 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060372 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060377 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060383 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060388 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 14:52:10.060403 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060402 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 14:52:10.060576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060409 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 14:52:10.060576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060415 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 14:52:10.060576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060426 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 14:52:10.060576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.060435 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 14:52:10.062098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.062085 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 14:52:10.062136 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.062101 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 14:52:10.065690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.065677 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 14:52:10.065769 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.065733 2575 server.go:1295] "Started kubelet"
Apr 16 14:52:10.065889 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.065830 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 14:52:10.065944 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.065850 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 14:52:10.065944 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.065930 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 14:52:10.066587 ip-10-0-130-229 systemd[1]: Started Kubernetes Kubelet.
Apr 16 14:52:10.066936 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.066916 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 14:52:10.068557 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.068543 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 14:52:10.069073 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.069046 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-229.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 14:52:10.069638 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.069614 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 14:52:10.069715 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.069613 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-229.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 14:52:10.071657 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.071640 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 14:52:10.072125 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072113 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 14:52:10.072731 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072712 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 14:52:10.072731 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072715 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 14:52:10.072853 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072741 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 14:52:10.072853 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072829 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 14:52:10.072853 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.072837 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 14:52:10.075392 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.075363 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.075717 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.075702 2575 factory.go:55] Registering systemd factory
Apr 16 14:52:10.075791 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.075764 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 14:52:10.076742 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.076725 2575 factory.go:153] Registering CRI-O factory
Apr 16 14:52:10.076827 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.076754 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 14:52:10.076878 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.076851 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 14:52:10.076878 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.076877 2575 factory.go:103] Registering Raw factory
Apr 16 14:52:10.076972 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.076896 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 14:52:10.077417 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.077401 2575 manager.go:319] Starting recovery of all containers
Apr 16 14:52:10.077600 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.077578 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 14:52:10.081994 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.081966 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdf6b"
Apr 16 14:52:10.082706 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.082550 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-229.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 14:52:10.082706 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.082577 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 14:52:10.083675 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.082615 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-229.ec2.internal.18a6ddf0f7edd710 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-229.ec2.internal,UID:ip-10-0-130-229.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-229.ec2.internal,},FirstTimestamp:2026-04-16 14:52:10.065688336 +0000 UTC m=+0.434970799,LastTimestamp:2026-04-16 14:52:10.065688336 +0000 UTC m=+0.434970799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-229.ec2.internal,}"
Apr 16 14:52:10.083915 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.083875 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 14:52:10.088154 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.088133 2575 manager.go:324] Recovery completed
Apr 16 14:52:10.088561 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.088540 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mdf6b"
Apr 16 14:52:10.092453 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.092437 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.095053 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095039 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.095109 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095065 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.095109 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095077 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.095572 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095558 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 14:52:10.095572 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095570 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 14:52:10.095668 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.095583 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 14:52:10.097727 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.097711 2575 policy_none.go:49] "None policy: Start"
Apr 16 14:52:10.097727 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.097726 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 14:52:10.097863 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.097736 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132386 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.132415 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132424 2575 server.go:85] "Starting device plugin registration server"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132604 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132614 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132716 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132791 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.132800 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.133152 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 14:52:10.146283 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.133201 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.200168 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.200144 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 14:52:10.200168 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.200171 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 14:52:10.200299 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.200186 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 14:52:10.200299 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.200194 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 14:52:10.200299 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.200243 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 14:52:10.202805 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.202790 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 14:52:10.233253 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.233218 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.234038 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.234023 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.234102 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.234050 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.234102 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.234061 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.234102 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.234081 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.242102 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.242090 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.242147 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.242108 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-229.ec2.internal\": node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.255143 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.255126 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.300899 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.300867 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal"]
Apr 16 14:52:10.300957 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.300934 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.302193 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.302181 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.302271 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.302202 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.302271 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.302225 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.303280 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303268 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.303430 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.303476 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303443 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.303874 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303852 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.303941 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303879 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.303941 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303888 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.304001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303885 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.304001 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.303993 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.304054 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.304004 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.305146 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.305134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.305254 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.305156 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 14:52:10.305747 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.305732 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 14:52:10.305747 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.305759 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 14:52:10.305899 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.305771 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeHasSufficientPID"
Apr 16 14:52:10.326805 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.326785 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-229.ec2.internal\" not found" node="ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.330822 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.330808 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-229.ec2.internal\" not found" node="ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.355764 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.355745 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.373324 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.373302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.373401 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.373329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.373401 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.373345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e289990daac467a09b205e75b6286d5f-config\") pod \"kube-apiserver-proxy-ip-10-0-130-229.ec2.internal\" (UID: \"e289990daac467a09b205e75b6286d5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal"
Apr 16 14:52:10.456025 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.456006 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found"
Apr 16 14:52:10.474307 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName:
\"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.474373 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.474373 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e289990daac467a09b205e75b6286d5f-config\") pod \"kube-apiserver-proxy-ip-10-0-130-229.ec2.internal\" (UID: \"e289990daac467a09b205e75b6286d5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.474442 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.474475 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10023b7ca0aac89d793122436ad8f10e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal\" (UID: \"10023b7ca0aac89d793122436ad8f10e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.474475 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.474448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e289990daac467a09b205e75b6286d5f-config\") pod \"kube-apiserver-proxy-ip-10-0-130-229.ec2.internal\" (UID: \"e289990daac467a09b205e75b6286d5f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.556580 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.556540 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:10.630027 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.630002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.633740 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.633725 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" Apr 16 14:52:10.656844 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.656813 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:10.757272 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.757250 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:10.857842 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.857792 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:10.958362 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:10.958338 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:10.988876 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.988851 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:10.989440 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:10.988970 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:11.004729 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.004711 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:11.059397 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:11.059365 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:11.072066 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.072048 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:11.082379 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.082361 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:11.089821 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.089794 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:10 +0000 UTC" deadline="2027-12-18 12:32:17.569350778 +0000 UTC" Apr 16 14:52:11.089821 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.089819 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14661h40m6.479534416s" Apr 16 14:52:11.103692 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.103676 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wv9jg" Apr 16 14:52:11.109016 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.108979 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wv9jg" Apr 16 14:52:11.132568 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:11.132546 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode289990daac467a09b205e75b6286d5f.slice/crio-40372b4f69c86b969ba9628975466abc0a12b8f9ec666f1903910bc151f401f5 WatchSource:0}: Error finding container 40372b4f69c86b969ba9628975466abc0a12b8f9ec666f1903910bc151f401f5: Status 404 returned error can't find the container with id 40372b4f69c86b969ba9628975466abc0a12b8f9ec666f1903910bc151f401f5 Apr 16 14:52:11.133199 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:11.133178 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10023b7ca0aac89d793122436ad8f10e.slice/crio-a027595292a7393a966e7e7ffc8e6284f1a334b3812ee638286ac837f99ffbf1 WatchSource:0}: Error finding container a027595292a7393a966e7e7ffc8e6284f1a334b3812ee638286ac837f99ffbf1: Status 404 returned error can't find the container with id a027595292a7393a966e7e7ffc8e6284f1a334b3812ee638286ac837f99ffbf1 Apr 16 14:52:11.137029 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.137016 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:11.149173 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.149154 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:11.160114 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:11.160096 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:11.203390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.203351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" event={"ID":"10023b7ca0aac89d793122436ad8f10e","Type":"ContainerStarted","Data":"a027595292a7393a966e7e7ffc8e6284f1a334b3812ee638286ac837f99ffbf1"} Apr 16 14:52:11.204326 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.204308 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" event={"ID":"e289990daac467a09b205e75b6286d5f","Type":"ContainerStarted","Data":"40372b4f69c86b969ba9628975466abc0a12b8f9ec666f1903910bc151f401f5"} Apr 16 14:52:11.260499 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:11.260479 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:11.360983 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:11.360929 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-229.ec2.internal\" not found" Apr 16 14:52:11.424868 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.424847 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:11.473805 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.473587 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" Apr 16 14:52:11.485766 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.485731 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:11.486931 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.486774 2575 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" Apr 16 14:52:11.495193 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:11.495084 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:12.055143 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.055116 2575 apiserver.go:52] "Watching apiserver" Apr 16 14:52:12.066514 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.066489 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:12.067551 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.067520 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf","openshift-cluster-node-tuning-operator/tuned-b64tn","openshift-dns/node-resolver-4vrpg","openshift-image-registry/node-ca-pnmnk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal","openshift-multus/multus-additional-cni-plugins-9pkk7","openshift-multus/multus-sg7qn","kube-system/konnectivity-agent-cfqtw","kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal","openshift-multus/network-metrics-daemon-qf2x7","openshift-network-diagnostics/network-check-target-85ccl","openshift-network-operator/iptables-alerter-cpd95","openshift-ovn-kubernetes/ovnkube-node-75cxs","kube-system/global-pull-secret-syncer-kjjnh"] Apr 16 14:52:12.068746 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.068720 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.070678 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.070640 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:12.070765 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.070739 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:12.071681 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.071659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.071681 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.071671 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.071830 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.071704 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:12.071830 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.071766 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.071830 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.071823 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:12.072049 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.072030 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb7vq\"" Apr 16 14:52:12.072876 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.072855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.076516 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.075900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:12.076516 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xftsh\"" Apr 16 14:52:12.076516 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076410 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.076516 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.076738 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076547 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.076845 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076828 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.076967 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076951 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.077023 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.076972 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b5zzr\"" Apr 16 14:52:12.078093 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.078077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.078236 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.078206 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.078977 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.078959 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:12.079457 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.079300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-26c42\"" Apr 16 14:52:12.079457 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.079393 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:12.079665 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.079645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.080425 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.080405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:12.080777 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.080759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.080925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-system-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.080960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-socket-dir-parent\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.080985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-multus-certs\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/497451ed-b0c8-4109-a9ab-9689ebba5dba-agent-certs\") pod \"konnectivity-agent-cfqtw\" (UID: \"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081143 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.081188 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/497451ed-b0c8-4109-a9ab-9689ebba5dba-konnectivity-ca\") pod \"konnectivity-agent-cfqtw\" (UID: 
\"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081287 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bv8c\"" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081417 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fj4s8\"" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-os-release\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-hostroot\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c4c3ee-509f-467b-b050-9b723bfce014-host\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-k8s-cni-cncf-io\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.081564 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081549 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.081564 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:12.081557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-kubelet\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081628 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-hosts-file\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081632 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmzg\" (UniqueName: \"kubernetes.io/projected/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-kube-api-access-jhmzg\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbqp\" (UniqueName: \"kubernetes.io/projected/38c4c3ee-509f-467b-b050-9b723bfce014-kube-api-access-fsbqp\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpd9h\" (UniqueName: \"kubernetes.io/projected/e0f509b4-277a-43cc-ab72-a486e31674af-kube-api-access-gpd9h\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cnibin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-bin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " 
pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-os-release\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-multus\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-tmp-dir\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.081966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cni-binary-copy\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-conf-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-etc-kubernetes\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-system-cni-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082108 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.082167 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-cnibin\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-netns\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-daemon-config\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzm2\" (UniqueName: \"kubernetes.io/projected/74f9607d-cf87-4aa2-af48-8f1cbac463ed-kube-api-access-hqzm2\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38c4c3ee-509f-467b-b050-9b723bfce014-serviceca\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 
14:52:12.082373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082539 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:12.082800 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.082707 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:12.083565 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.083546 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.083998 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.083969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rkf9w\"" Apr 16 14:52:12.083998 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.083982 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:12.083998 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.083988 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.084176 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.084034 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pnhl7\"" Apr 16 14:52:12.084294 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.084278 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.084771 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.084757 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.084821 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.084803 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:12.086041 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.086024 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jzmn\"" Apr 16 14:52:12.086196 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.086182 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:12.086364 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.086307 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:12.086457 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.086442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:12.110003 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.109964 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:11 +0000 UTC" deadline="2028-01-17 12:45:31.524762146 +0000 UTC" Apr 16 14:52:12.110003 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.109992 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15381h53m19.414774547s" Apr 16 14:52:12.173643 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.173624 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:12.182610 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.182699 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysconfig\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.182699 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-socket-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.182699 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-os-release\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.182831 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182710 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-hostroot\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.182831 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-os-release\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.182831 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-hostroot\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.182831 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c4c3ee-509f-467b-b050-9b723bfce014-host\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/38c4c3ee-509f-467b-b050-9b723bfce014-host\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-etc-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182899 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-kubelet\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-hosts-file\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmzg\" (UniqueName: \"kubernetes.io/projected/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-kube-api-access-jhmzg\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.182995 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.182985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbqp\" (UniqueName: \"kubernetes.io/projected/38c4c3ee-509f-467b-b050-9b723bfce014-kube-api-access-fsbqp\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-netns\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-hosts-file\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-sys-fs\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cnibin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-os-release\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.183419 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:12.182982 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-kubelet\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c5s9\" (UniqueName: \"kubernetes.io/projected/e9568559-bb27-4f69-aa6e-e169bbfd3048-kube-api-access-4c5s9\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-log-socket\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cnibin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-os-release\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-multus\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-multus\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-registration-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" 
(UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183339 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-conf-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.183419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-netd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-sys\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-conf-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-host\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-binary-copy\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-netns\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzm2\" (UniqueName: 
\"kubernetes.io/projected/74f9607d-cf87-4aa2-af48-8f1cbac463ed-kube-api-access-hqzm2\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-netns\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-systemd-units\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-modprobe-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-kubernetes\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38c4c3ee-509f-467b-b050-9b723bfce014-serviceca\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-kubelet\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-var-lib-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184135 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:12.183706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-conf\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-system-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184135 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-socket-dir-parent\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/497451ed-b0c8-4109-a9ab-9689ebba5dba-agent-certs\") pod \"konnectivity-agent-cfqtw\" (UID: \"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/497451ed-b0c8-4109-a9ab-9689ebba5dba-konnectivity-ca\") pod \"konnectivity-agent-cfqtw\" (UID: \"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-system-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-socket-dir-parent\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-dbus\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.183990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-slash\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-tuned\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-kubelet-config\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/38c4c3ee-509f-467b-b050-9b723bfce014-serviceca\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-ovn\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-bin\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-var-lib-kubelet\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184126 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184153 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-k8s-cni-cncf-io\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.184908 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184151 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-k8s-cni-cncf-io\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpd9h\" (UniqueName: \"kubernetes.io/projected/e0f509b4-277a-43cc-ab72-a486e31674af-kube-api-access-gpd9h\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-bin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-node-log\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-env-overrides\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.185526 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:12.184308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-script-lib\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-var-lib-cni-bin\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjb9\" (UniqueName: \"kubernetes.io/projected/012eecca-9f9b-4a13-8adc-05b585fd794b-kube-api-access-pzjb9\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-tmp-dir\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/497451ed-b0c8-4109-a9ab-9689ebba5dba-konnectivity-ca\") pod \"konnectivity-agent-cfqtw\" (UID: \"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9568559-bb27-4f69-aa6e-e169bbfd3048-host-slash\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrphc\" (UniqueName: \"kubernetes.io/projected/63b14197-55a5-4407-8c24-397ab7006750-kube-api-access-qrphc\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-tmp\") 
pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-device-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.185526 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cni-binary-copy\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-etc-kubernetes\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-tmp-dir\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-system-cni-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-etc-kubernetes\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-system-cni-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-cnibin\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-systemd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-cnibin\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63b14197-55a5-4407-8c24-397ab7006750-ovn-node-metrics-cert\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-run\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-daemon-config\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-config\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-systemd\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.186334 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:12.184925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lwnq\" (UniqueName: \"kubernetes.io/projected/7822caac-b450-425a-bec0-981f5e05c867-kube-api-access-5lwnq\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.186334 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-cni-dir\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9568559-bb27-4f69-aa6e-e169bbfd3048-iptables-alerter-script\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-lib-modules\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.184999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlcz\" (UniqueName: \"kubernetes.io/projected/39650afc-ce1c-4648-83c6-5b4969c0db6a-kube-api-access-nmlcz\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-cni-binary-copy\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-multus-certs\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74f9607d-cf87-4aa2-af48-8f1cbac463ed-host-run-multus-certs\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e0f509b4-277a-43cc-ab72-a486e31674af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74f9607d-cf87-4aa2-af48-8f1cbac463ed-multus-daemon-config\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.187071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.185564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e0f509b4-277a-43cc-ab72-a486e31674af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.187582 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.187566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/497451ed-b0c8-4109-a9ab-9689ebba5dba-agent-certs\") pod \"konnectivity-agent-cfqtw\" (UID: \"497451ed-b0c8-4109-a9ab-9689ebba5dba\") " pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.189563 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.189475 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:12.189563 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.189497 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:12.189563 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.189510 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:12.189872 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.189568 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:12.68954975 +0000 UTC m=+3.058832200 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:12.191919 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.191897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbqp\" (UniqueName: \"kubernetes.io/projected/38c4c3ee-509f-467b-b050-9b723bfce014-kube-api-access-fsbqp\") pod \"node-ca-pnmnk\" (UID: \"38c4c3ee-509f-467b-b050-9b723bfce014\") " pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.192324 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.192282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpd9h\" (UniqueName: \"kubernetes.io/projected/e0f509b4-277a-43cc-ab72-a486e31674af-kube-api-access-gpd9h\") pod \"multus-additional-cni-plugins-9pkk7\" (UID: \"e0f509b4-277a-43cc-ab72-a486e31674af\") " pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.192711 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.192690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmzg\" (UniqueName: \"kubernetes.io/projected/55ad67c5-7bb1-4c6c-8c58-869beff80d7f-kube-api-access-jhmzg\") pod \"node-resolver-4vrpg\" (UID: \"55ad67c5-7bb1-4c6c-8c58-869beff80d7f\") " pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.192798 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.192731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzm2\" (UniqueName: \"kubernetes.io/projected/74f9607d-cf87-4aa2-af48-8f1cbac463ed-kube-api-access-hqzm2\") pod \"multus-sg7qn\" (UID: \"74f9607d-cf87-4aa2-af48-8f1cbac463ed\") " pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.285551 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-etc-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-etc-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-netns\") pod \"ovnkube-node-75cxs\" (UID: 
\"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-sys-fs\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c5s9\" (UniqueName: \"kubernetes.io/projected/e9568559-bb27-4f69-aa6e-e169bbfd3048-kube-api-access-4c5s9\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-netns\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-log-socket\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-sys-fs\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-log-socket\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.285690 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-registration-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-netd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-sys\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-registration-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-netd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-sys\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-host\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-systemd-units\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285869 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-host\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-etc-selinux\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-modprobe-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-kubernetes\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285911 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-systemd-units\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-run-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-kubelet\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286111 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-kubelet\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-kubernetes\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.285999 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-var-lib-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-conf\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-modprobe-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-var-lib-openvswitch\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-dbus\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-slash\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-tuned\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286121 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-kubelet-config\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" 
Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-ovn\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-conf\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-dbus\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-bin\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-var-lib-kubelet\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-slash\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.286865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-node-log\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-env-overrides\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.286327 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-script-lib\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjb9\" (UniqueName: \"kubernetes.io/projected/012eecca-9f9b-4a13-8adc-05b585fd794b-kube-api-access-pzjb9\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.286390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:12.786371903 +0000 UTC m=+3.155654357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286417 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9568559-bb27-4f69-aa6e-e169bbfd3048-host-slash\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrphc\" (UniqueName: \"kubernetes.io/projected/63b14197-55a5-4407-8c24-397ab7006750-kube-api-access-qrphc\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-tmp\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-device-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-systemd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.287624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.286584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63b14197-55a5-4407-8c24-397ab7006750-ovn-node-metrics-cert\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.287948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-run\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 
14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-config\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-systemd\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lwnq\" (UniqueName: \"kubernetes.io/projected/7822caac-b450-425a-bec0-981f5e05c867-kube-api-access-5lwnq\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9568559-bb27-4f69-aa6e-e169bbfd3048-iptables-alerter-script\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-lib-modules\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlcz\" (UniqueName: \"kubernetes.io/projected/39650afc-ce1c-4648-83c6-5b4969c0db6a-kube-api-access-nmlcz\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e9568559-bb27-4f69-aa6e-e169bbfd3048-host-slash\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/0892e381-08bb-4454-99af-9dd414b35525-kubelet-config\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-run\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-var-lib-kubelet\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.288802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-ovn\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.289548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-env-overrides\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-run-systemd\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-cni-bin\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-systemd\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.289789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-config\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-device-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.289789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-lib-modules\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.289789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-node-log\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.289970 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.289799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysctl-d\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.289970 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.289869 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:12.289970 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.289926 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:12.789911363 +0000 UTC m=+3.159193824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:12.290117 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.288205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63b14197-55a5-4407-8c24-397ab7006750-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.290117 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysconfig\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.290302 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-socket-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.290302 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-tuned\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.290302 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7822caac-b450-425a-bec0-981f5e05c867-socket-dir\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.290436 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/39650afc-ce1c-4648-83c6-5b4969c0db6a-etc-sysconfig\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.290487 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.290472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/63b14197-55a5-4407-8c24-397ab7006750-ovnkube-script-lib\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.292064 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.291378 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/63b14197-55a5-4407-8c24-397ab7006750-ovn-node-metrics-cert\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.292308 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.292248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e9568559-bb27-4f69-aa6e-e169bbfd3048-iptables-alerter-script\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.292608 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.292591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/39650afc-ce1c-4648-83c6-5b4969c0db6a-tmp\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.294176 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.294154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c5s9\" (UniqueName: \"kubernetes.io/projected/e9568559-bb27-4f69-aa6e-e169bbfd3048-kube-api-access-4c5s9\") pod \"iptables-alerter-cpd95\" (UID: \"e9568559-bb27-4f69-aa6e-e169bbfd3048\") " pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.294349 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.294332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjb9\" (UniqueName: \"kubernetes.io/projected/012eecca-9f9b-4a13-8adc-05b585fd794b-kube-api-access-pzjb9\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.303453 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.303426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlcz\" (UniqueName: \"kubernetes.io/projected/39650afc-ce1c-4648-83c6-5b4969c0db6a-kube-api-access-nmlcz\") pod \"tuned-b64tn\" (UID: \"39650afc-ce1c-4648-83c6-5b4969c0db6a\") " pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.303685 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.303669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lwnq\" (UniqueName: \"kubernetes.io/projected/7822caac-b450-425a-bec0-981f5e05c867-kube-api-access-5lwnq\") pod \"aws-ebs-csi-driver-node-xzqsf\" (UID: \"7822caac-b450-425a-bec0-981f5e05c867\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.304450 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.304431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrphc\" (UniqueName: \"kubernetes.io/projected/63b14197-55a5-4407-8c24-397ab7006750-kube-api-access-qrphc\") pod \"ovnkube-node-75cxs\" (UID: \"63b14197-55a5-4407-8c24-397ab7006750\") " pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.360120 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.360065 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:12.381235 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.381197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sg7qn" Apr 16 14:52:12.387876 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.387859 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pnmnk" Apr 16 14:52:12.393443 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.393428 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4vrpg" Apr 16 14:52:12.398954 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.398934 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:12.403473 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.403454 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" Apr 16 14:52:12.409966 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.409949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cpd95" Apr 16 14:52:12.415519 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.415498 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:12.421065 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.421046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-b64tn" Apr 16 14:52:12.425585 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.425569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" Apr 16 14:52:12.625347 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.625296 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ad67c5_7bb1_4c6c_8c58_869beff80d7f.slice/crio-a7c5ebe35dcf88891abbfb6ae1ab7ae837447c76274c05a874be0f213fddceea WatchSource:0}: Error finding container a7c5ebe35dcf88891abbfb6ae1ab7ae837447c76274c05a874be0f213fddceea: Status 404 returned error can't find the container with id a7c5ebe35dcf88891abbfb6ae1ab7ae837447c76274c05a874be0f213fddceea Apr 16 14:52:12.626317 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.626222 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9568559_bb27_4f69_aa6e_e169bbfd3048.slice/crio-cda1d670fa38d9310bcd5c646485c8be43b514c318d5829c4c737203a3640ac7 WatchSource:0}: Error finding container cda1d670fa38d9310bcd5c646485c8be43b514c318d5829c4c737203a3640ac7: Status 404 returned error can't find the container with id cda1d670fa38d9310bcd5c646485c8be43b514c318d5829c4c737203a3640ac7 Apr 16 14:52:12.627557 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.627416 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c4c3ee_509f_467b_b050_9b723bfce014.slice/crio-a4e897e8f99c584f7eb46eb66fe9d364b2216a25e5eb96cd18b88a0a8ec9d24f WatchSource:0}: Error finding container a4e897e8f99c584f7eb46eb66fe9d364b2216a25e5eb96cd18b88a0a8ec9d24f: Status 404 returned error can't find the container with id a4e897e8f99c584f7eb46eb66fe9d364b2216a25e5eb96cd18b88a0a8ec9d24f Apr 16 14:52:12.630394 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.630376 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f9607d_cf87_4aa2_af48_8f1cbac463ed.slice/crio-637cfc79904fa909cef95d01020d8cb209b817d4aa7c420e49d9e11a042c79d1 
WatchSource:0}: Error finding container 637cfc79904fa909cef95d01020d8cb209b817d4aa7c420e49d9e11a042c79d1: Status 404 returned error can't find the container with id 637cfc79904fa909cef95d01020d8cb209b817d4aa7c420e49d9e11a042c79d1 Apr 16 14:52:12.631930 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.631908 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0f509b4_277a_43cc_ab72_a486e31674af.slice/crio-eceb2dc0e57f350434a29b684d28c4d2c5ae786a868ae1c64db22245119e125b WatchSource:0}: Error finding container eceb2dc0e57f350434a29b684d28c4d2c5ae786a868ae1c64db22245119e125b: Status 404 returned error can't find the container with id eceb2dc0e57f350434a29b684d28c4d2c5ae786a868ae1c64db22245119e125b Apr 16 14:52:12.632682 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.632656 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39650afc_ce1c_4648_83c6_5b4969c0db6a.slice/crio-70d812971eeee9a77c2fd1af6945f149db520972d65208a0bb9022b5f0bed2b3 WatchSource:0}: Error finding container 70d812971eeee9a77c2fd1af6945f149db520972d65208a0bb9022b5f0bed2b3: Status 404 returned error can't find the container with id 70d812971eeee9a77c2fd1af6945f149db520972d65208a0bb9022b5f0bed2b3 Apr 16 14:52:12.633743 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.633724 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7822caac_b450_425a_bec0_981f5e05c867.slice/crio-9102a880191d3ba3e0b907f2df6b18f756b38788759330f336ecdadd834fe0cb WatchSource:0}: Error finding container 9102a880191d3ba3e0b907f2df6b18f756b38788759330f336ecdadd834fe0cb: Status 404 returned error can't find the container with id 9102a880191d3ba3e0b907f2df6b18f756b38788759330f336ecdadd834fe0cb Apr 16 14:52:12.635253 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.635231 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63b14197_55a5_4407_8c24_397ab7006750.slice/crio-a061f68f1e6907409815f140d8151a171917595f30e76ba0c1c246871d1632df WatchSource:0}: Error finding container a061f68f1e6907409815f140d8151a171917595f30e76ba0c1c246871d1632df: Status 404 returned error can't find the container with id a061f68f1e6907409815f140d8151a171917595f30e76ba0c1c246871d1632df Apr 16 14:52:12.636316 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:12.636294 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod497451ed_b0c8_4109_a9ab_9689ebba5dba.slice/crio-ba06374c65e7bdd0e62f1740b202af6da88ec9090fdfb7f519b2eb6d0c8e86ec WatchSource:0}: Error finding container ba06374c65e7bdd0e62f1740b202af6da88ec9090fdfb7f519b2eb6d0c8e86ec: Status 404 returned error can't find the container with id ba06374c65e7bdd0e62f1740b202af6da88ec9090fdfb7f519b2eb6d0c8e86ec Apr 16 14:52:12.692747 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.692633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:12.692813 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.692770 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:12.692813 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.692785 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:12.692813 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.692794 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:12.692964 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.692837 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.692824066 +0000 UTC m=+4.062106516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:12.793140 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.793117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:12.793328 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:12.793159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:12.793328 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.793268 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:12.793328 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.793292 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:12.793328 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.793324 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.793310826 +0000 UTC m=+4.162593279 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:12.793526 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:12.793338 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:13.79333218 +0000 UTC m=+4.162614630 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:13.110396 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.110265 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:11 +0000 UTC" deadline="2028-01-23 05:05:33.281366832 +0000 UTC" Apr 16 14:52:13.110396 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.110298 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15518h13m20.171072223s" Apr 16 14:52:13.201557 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.200562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:13.201557 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.200670 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:13.214284 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.213876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cfqtw" event={"ID":"497451ed-b0c8-4109-a9ab-9689ebba5dba","Type":"ContainerStarted","Data":"ba06374c65e7bdd0e62f1740b202af6da88ec9090fdfb7f519b2eb6d0c8e86ec"} Apr 16 14:52:13.218658 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.218610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" event={"ID":"7822caac-b450-425a-bec0-981f5e05c867","Type":"ContainerStarted","Data":"9102a880191d3ba3e0b907f2df6b18f756b38788759330f336ecdadd834fe0cb"} Apr 16 14:52:13.219722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.219664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerStarted","Data":"eceb2dc0e57f350434a29b684d28c4d2c5ae786a868ae1c64db22245119e125b"} Apr 16 14:52:13.222125 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.222095 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sg7qn" event={"ID":"74f9607d-cf87-4aa2-af48-8f1cbac463ed","Type":"ContainerStarted","Data":"637cfc79904fa909cef95d01020d8cb209b817d4aa7c420e49d9e11a042c79d1"} Apr 16 14:52:13.223637 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.223611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnmnk" event={"ID":"38c4c3ee-509f-467b-b050-9b723bfce014","Type":"ContainerStarted","Data":"a4e897e8f99c584f7eb46eb66fe9d364b2216a25e5eb96cd18b88a0a8ec9d24f"} Apr 16 14:52:13.224964 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.224920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cpd95" event={"ID":"e9568559-bb27-4f69-aa6e-e169bbfd3048","Type":"ContainerStarted","Data":"cda1d670fa38d9310bcd5c646485c8be43b514c318d5829c4c737203a3640ac7"} Apr 16 14:52:13.227589 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.227561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" event={"ID":"e289990daac467a09b205e75b6286d5f","Type":"ContainerStarted","Data":"dd6467dd14c73364059c07dcb841e1934c1a2d91f05d3075b752879b70e8f281"} Apr 16 14:52:13.230226 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.230191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"a061f68f1e6907409815f140d8151a171917595f30e76ba0c1c246871d1632df"} Apr 16 14:52:13.232280 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.232232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b64tn" event={"ID":"39650afc-ce1c-4648-83c6-5b4969c0db6a","Type":"ContainerStarted","Data":"70d812971eeee9a77c2fd1af6945f149db520972d65208a0bb9022b5f0bed2b3"} Apr 16 14:52:13.234577 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.234532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4vrpg" event={"ID":"55ad67c5-7bb1-4c6c-8c58-869beff80d7f","Type":"ContainerStarted","Data":"a7c5ebe35dcf88891abbfb6ae1ab7ae837447c76274c05a874be0f213fddceea"} Apr 16 14:52:13.243544 ip-10-0-130-229 
kubenswrapper[2575]: I0416 14:52:13.242761 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-229.ec2.internal" podStartSLOduration=2.242748873 podStartE2EDuration="2.242748873s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:13.242343072 +0000 UTC m=+3.611625555" watchObservedRunningTime="2026-04-16 14:52:13.242748873 +0000 UTC m=+3.612031347" Apr 16 14:52:13.700273 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.700184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:13.700431 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.700345 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:13.700431 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.700363 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:13.700431 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.700377 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:13.700431 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.700431 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:15.7004138 +0000 UTC m=+6.069696257 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.801506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:13.801563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.801699 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.801712 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.801774 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:15.801757135 +0000 UTC m=+6.171039585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:13.801798 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:13.801791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:15.80178396 +0000 UTC m=+6.171066410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:14.203598 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:14.200664 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:14.203598 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:14.200793 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:14.203598 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:14.203427 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:14.203598 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:14.203525 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:14.249141 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:14.247983 2575 generic.go:358] "Generic (PLEG): container finished" podID="10023b7ca0aac89d793122436ad8f10e" containerID="74df7cdc1057fcd6ff39ca7f522de543baadc2b75fb35f7a6965201a41fb715d" exitCode=0 Apr 16 14:52:14.249141 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:14.248861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" event={"ID":"10023b7ca0aac89d793122436ad8f10e","Type":"ContainerDied","Data":"74df7cdc1057fcd6ff39ca7f522de543baadc2b75fb35f7a6965201a41fb715d"} Apr 16 14:52:15.201124 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:15.201089 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:15.201311 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.201236 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:15.254820 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:15.254786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" event={"ID":"10023b7ca0aac89d793122436ad8f10e","Type":"ContainerStarted","Data":"02041a0d859e09d3f40688558868f4949ba6100818ee1b052c6638a72eec20b8"} Apr 16 14:52:15.719140 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:15.719104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:15.719338 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.719287 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:15.719338 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.719308 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:15.719338 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.719321 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:15.719492 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.719381 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:19.719362568 +0000 UTC m=+10.088645030 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:15.820714 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:15.820671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:15.820886 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:15.820765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:15.820886 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.820824 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:15.820886 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.820869 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:15.821031 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.820893 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:19.820873301 +0000 UTC m=+10.190155756 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:15.821031 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:15.820917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:19.820902836 +0000 UTC m=+10.190185299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:16.201584 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:16.200820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:16.201584 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:16.200980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:16.201584 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:16.201408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:16.201584 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:16.201493 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:17.201144 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:17.201107 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:17.201595 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:17.201262 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:18.201237 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:18.200796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:18.201237 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:18.200924 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:18.201676 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:18.200796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:18.201676 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:18.201363 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:19.200841 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:19.200781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:19.201012 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.200900 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:19.753343 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:19.753308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:19.753747 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.753467 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:19.753747 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.753488 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:19.753747 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.753501 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:19.753747 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.753558 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:27.753540314 +0000 UTC m=+18.122822768 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:19.854489 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:19.854454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:19.854637 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:19.854514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:19.854702 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.854658 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:19.854771 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.854713 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:27.854696296 +0000 UTC m=+18.223978749 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:52:19.855077 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.855062 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:19.855115 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:19.855102 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:27.855091536 +0000 UTC m=+18.224373986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:20.202172 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:20.201671 2575 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:52:20.202172 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:20.201781 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:20.202172 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:20.201892 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
Apr 16 14:52:20.202172 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:20.201985 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:21.200834 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:21.200800 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:21.201298 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:21.200914 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04"
Apr 16 14:52:22.200907 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:22.200824 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:22.201380 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:22.200980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:22.201380 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:22.201035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
Apr 16 14:52:22.201380 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:22.201153 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:23.201357 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:23.201332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:23.201734 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:23.201433 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04"
Apr 16 14:52:24.201060 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:24.201030 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:24.201237 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:24.201150 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:24.201237 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:24.201223 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
Apr 16 14:52:24.201335 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:24.201309 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:25.200340 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:25.200310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:25.200758 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:25.200426 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04"
Apr 16 14:52:26.201306 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:26.201278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:26.201724 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:26.201392 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:26.201724 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:26.201468 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
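Every one of these NetworkPluginNotReady errors traces back to the same underlying condition: the container runtime reports NetworkReady=false because nothing has yet written a CNI config file under /etc/kubernetes/cni/net.d/ (OVN-Kubernetes is still coming up, as the ovnkube-node events further down show). A sketch of the kind of directory probe involved, using the reference libcni helper; this is illustrative, not CRI-O's actual code:

    package main

    import (
    	"fmt"

    	"github.com/containernetworking/cni/libcni"
    )

    func main() {
    	// The runtime keeps the network NotReady until a CNI config with one
    	// of these extensions appears in the conf directory named in the log.
    	files, err := libcni.ConfFiles("/etc/kubernetes/cni/net.d", []string{".conf", ".conflist", ".json"})
    	if err != nil {
    		fmt.Println("cannot read CNI conf dir:", err)
    		return
    	}
    	if len(files) == 0 {
    		fmt.Println("no CNI configuration file; pods needing a pod-network sandbox stay pending")
    		return
    	}
    	fmt.Println("CNI config candidates:", files)
    }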
Apr 16 14:52:26.201724 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:26.201583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:27.201366 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:27.201338 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:27.201911 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.201438 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04"
Apr 16 14:52:27.820194 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:27.820162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:27.820417 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.820296 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 14:52:27.820417 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.820311 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 14:52:27.820417 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.820319 2575 projected.go:194] Error preparing data for projected volume kube-api-access-qqc9z for pod openshift-network-diagnostics/network-check-target-85ccl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:27.820417 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.820363 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z podName:4a74642f-0b47-4e56-931c-041808066f04 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.820351065 +0000 UTC m=+34.189633519 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qqc9z" (UniqueName: "kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z") pod "network-check-target-85ccl" (UID: "4a74642f-0b47-4e56-931c-041808066f04") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 14:52:27.921032 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:27.920990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:27.921032 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:27.921039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh"
Apr 16 14:52:27.921279 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.921140 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:27.921279 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.921141 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:27.921279 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.921199 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret podName:0892e381-08bb-4454-99af-9dd414b35525 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.921183983 +0000 UTC m=+34.290466453 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret") pod "global-pull-secret-syncer-kjjnh" (UID: "0892e381-08bb-4454-99af-9dd414b35525") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:52:27.921279 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:27.921256 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.921235732 +0000 UTC m=+34.290518193 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 14:52:28.200973 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:28.200936 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:28.200973 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:28.200971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
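Comparing this retry block with the one at 14:52:19 shows durationBeforeRetry doubling from 8s to 16s, the signature of an exponential backoff on failed volume operations. A self-contained sketch of such a policy follows; the initial value and the cap are assumptions for illustration, not the kubelet's actual constants:

    package main

    import (
    	"fmt"
    	"time"
    )

    // nextBackoff doubles the wait after each failure, up to a cap, matching
    // the 8s -> 16s progression in the nestedpendingoperations lines above.
    func nextBackoff(last time.Duration) time.Duration {
    	const initial = 500 * time.Millisecond // assumed starting point
    	const maxWait = 2 * time.Minute        // assumed cap
    	if last <= 0 {
    		return initial
    	}
    	if next := 2 * last; next < maxWait {
    		return next
    	}
    	return maxWait
    }

    func main() {
    	var d time.Duration
    	for i := 0; i < 7; i++ {
    		d = nextBackoff(d)
    		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s
    	}
    }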
Apr 16 14:52:28.201264 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:28.201072 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:28.201264 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:28.201201 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:29.201038 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.200855 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Apr 16 14:52:29.201400 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:29.201139 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04"
Apr 16 14:52:29.277379 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.277348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-b64tn" event={"ID":"39650afc-ce1c-4648-83c6-5b4969c0db6a","Type":"ContainerStarted","Data":"77991eb672ff907e0ac32e5b8dca425e3583f003b44e9bb8c078c4ef6ba89783"}
Apr 16 14:52:29.278726 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.278696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cfqtw" event={"ID":"497451ed-b0c8-4109-a9ab-9689ebba5dba","Type":"ContainerStarted","Data":"cd8abab7596571d159f4f03285656807d693c9c645bd72812248059fe01c4b01"}
Apr 16 14:52:29.279708 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.279685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" event={"ID":"7822caac-b450-425a-bec0-981f5e05c867","Type":"ContainerStarted","Data":"97592df0227b388d9cdec4d94cae2d9e866991114393912fa02bbb12c184fa69"}
Apr 16 14:52:29.281888 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.281513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerStarted","Data":"e5da88a42efb3e0e442c7d6960e522908dc0793e9f1fb128ad4eddcd081c8b64"}
Apr 16 14:52:29.285695 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.285661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sg7qn" event={"ID":"74f9607d-cf87-4aa2-af48-8f1cbac463ed","Type":"ContainerStarted","Data":"4eb54df16865e7da4b242cba5c0f9a1765cc241b94a6ebea4eb416df8baca1ef"}
Apr 16 14:52:29.286743 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.286723 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnmnk" event={"ID":"38c4c3ee-509f-467b-b050-9b723bfce014","Type":"ContainerStarted","Data":"9624d206768a5197e5a993b7ebbd5d32f9d43a705a81ba2522aed5568cc501ea"}
Apr 16 14:52:29.291499 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.291289 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-b64tn" podStartSLOduration=3.04466344 podStartE2EDuration="19.291280076s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.635439398 +0000 UTC m=+3.004721855" lastFinishedPulling="2026-04-16 14:52:28.882056025 +0000 UTC m=+19.251338491" observedRunningTime="2026-04-16 14:52:29.291030363 +0000 UTC m=+19.660312835" watchObservedRunningTime="2026-04-16 14:52:29.291280076 +0000 UTC m=+19.660562548"
Apr 16 14:52:29.291499 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.291420 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-229.ec2.internal" podStartSLOduration=18.291416655 podStartE2EDuration="18.291416655s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:15.269673669 +0000 UTC m=+5.638956142" watchObservedRunningTime="2026-04-16 14:52:29.291416655 +0000 UTC m=+19.660699127"
Apr 16 14:52:29.303464 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.303429 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sg7qn" podStartSLOduration=3.021193565 podStartE2EDuration="19.303418884s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.633591566 +0000 UTC m=+3.002874016" lastFinishedPulling="2026-04-16 14:52:28.915816881 +0000 UTC m=+19.285099335" observedRunningTime="2026-04-16 14:52:29.303366564 +0000 UTC m=+19.672649047" watchObservedRunningTime="2026-04-16 14:52:29.303418884 +0000 UTC m=+19.672701355"
Apr 16 14:52:29.335011 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.334975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pnmnk" podStartSLOduration=3.083708005 podStartE2EDuration="19.33496484s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.629158509 +0000 UTC m=+2.998440969" lastFinishedPulling="2026-04-16 14:52:28.88041533 +0000 UTC m=+19.249697804" observedRunningTime="2026-04-16 14:52:29.320968652 +0000 UTC m=+19.690251135" watchObservedRunningTime="2026-04-16 14:52:29.33496484 +0000 UTC m=+19.704247294"
Apr 16 14:52:29.335272 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:29.335241 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cfqtw" podStartSLOduration=3.092591498 podStartE2EDuration="19.335233869s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.637878623 +0000 UTC m=+3.007161073" lastFinishedPulling="2026-04-16 14:52:28.880520981 +0000 UTC m=+19.249803444" observedRunningTime="2026-04-16 14:52:29.334961949 +0000 UTC m=+19.704244420" watchObservedRunningTime="2026-04-16 14:52:29.335233869 +0000 UTC m=+19.704516343"
Apr 16 14:52:30.189914 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.189737 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 14:52:30.201732 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.201712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:52:30.202341 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:30.201785 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b"
Apr 16 14:52:30.202341 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.201819 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh"
Apr 16 14:52:30.202341 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:30.201861 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525"
Apr 16 14:52:30.290083 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.290055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" event={"ID":"7822caac-b450-425a-bec0-981f5e05c867","Type":"ContainerStarted","Data":"874354935b6cbe774db522bf8a297fe5bbe966d8c81026edab1dbba6a03360b4"}
Apr 16 14:52:30.291273 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.291241 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="e5da88a42efb3e0e442c7d6960e522908dc0793e9f1fb128ad4eddcd081c8b64" exitCode=0
Apr 16 14:52:30.291357 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.291304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"e5da88a42efb3e0e442c7d6960e522908dc0793e9f1fb128ad4eddcd081c8b64"}
Apr 16 14:52:30.292466 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.292408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cpd95" event={"ID":"e9568559-bb27-4f69-aa6e-e169bbfd3048","Type":"ContainerStarted","Data":"6daaa8787a3b0833b465079caa4987a9d192e62ab144544c972654cbb3ddb9b3"}
Apr 16 14:52:30.294835 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.294819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 14:52:30.295097 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295083 2575 generic.go:358] "Generic (PLEG): container finished" podID="63b14197-55a5-4407-8c24-397ab7006750" containerID="6e3a83ff5d01ef51cf2aae24228aa6ad26b4ae150fb1c1e5f74f35fc7e468a97" exitCode=1
Apr 16 14:52:30.295147 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"8b718fb4e83369309a58077510d3b2f8a09be4fedee027b976ab5fe08489d04d"}
Apr 16 14:52:30.295177 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"be152fa76c7372927a91027893a10c01d05fc4f6053e91860c4d36ba73d426e5"}
Apr 16 14:52:30.295177 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"2d80c0d33dc3e96f55f1d313b28863001cfe2d66e70e8ace1d36299f2adc0f6c"}
Apr 16 14:52:30.295177 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"caa62929bd50be092faae9fc7bd4358f582519352f1138d801fcdbe0fd1fe327"}
Apr 16 14:52:30.295319 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerDied","Data":"6e3a83ff5d01ef51cf2aae24228aa6ad26b4ae150fb1c1e5f74f35fc7e468a97"}
Apr 16 14:52:30.295319 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.295197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"fe56b6b2a77e5fef5ece26af7ee2437a1b57f1871ff3efcb1ec1367d0955481a"}
Apr 16 14:52:30.296338 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.296311 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4vrpg" event={"ID":"55ad67c5-7bb1-4c6c-8c58-869beff80d7f","Type":"ContainerStarted","Data":"eccdae6f35e7f9b039ffa1c946b9671b3bde4d4c4724a4b56da3778dd8c00852"}
Apr 16 14:52:30.323814 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.323778 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4vrpg" podStartSLOduration=4.069954886 podStartE2EDuration="20.323767321s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.626736778 +0000 UTC m=+2.996019229" lastFinishedPulling="2026-04-16 14:52:28.880549199 +0000 UTC m=+19.249831664" observedRunningTime="2026-04-16 14:52:30.3233819 +0000 UTC m=+20.692664372" watchObservedRunningTime="2026-04-16 14:52:30.323767321 +0000 UTC m=+20.693049792"
Apr 16 14:52:30.335185 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:30.335154 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cpd95" podStartSLOduration=4.08302623 podStartE2EDuration="20.335143959s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.628300396 +0000 UTC m=+2.997582859" lastFinishedPulling="2026-04-16 14:52:28.880418127 +0000 UTC m=+19.249700588" observedRunningTime="2026-04-16 14:52:30.334864654 +0000 UTC m=+20.704147126" watchObservedRunningTime="2026-04-16 14:52:30.335143959 +0000 UTC m=+20.704426431"
Apr 16 14:52:31.143364 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:31.143272 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:52:30.189886584Z","UUID":"0a8349c7-7390-41e0-b0d3-aff39e45783f","Handler":null,"Name":"","Endpoint":""}
Apr 16 14:52:31.144981 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:31.144952 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 14:52:31.145107 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:31.145007 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
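These lines are the CSI registration handshake in order: the plugin watcher notices the ebs.csi.aws.com-reg.sock registration socket under /var/lib/kubelet/plugins_registry, the operation executor registers the plugin, and the CSI handler validates the driver and records its csi.sock endpoint. The discovery step amounts to watching a directory for new sockets; a rough sketch of that idea with fsnotify, not the kubelet's implementation:

    package main

    import (
    	"log"
    	"strings"

    	"github.com/fsnotify/fsnotify"
    )

    func main() {
    	const registryDir = "/var/lib/kubelet/plugins_registry"
    	w, err := fsnotify.NewWatcher()
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer w.Close()
    	if err := w.Add(registryDir); err != nil {
    		log.Fatal(err)
    	}
    	for ev := range w.Events {
    		// A new *-reg.sock (e.g. ebs.csi.aws.com-reg.sock) would trigger
    		// an info handshake over that socket before the driver is
    		// registered, as in the csi_plugin.go lines above.
    		if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, ".sock") {
    			log.Printf("candidate plugin socket: %s", ev.Name)
    		}
    	}
    }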
pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:32.302805 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:32.302771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" event={"ID":"7822caac-b450-425a-bec0-981f5e05c867","Type":"ContainerStarted","Data":"9943184acb4c3f3982bd49c470d28bae908da9fcefec49660c6dbfaa99ff1d47"} Apr 16 14:52:32.305660 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:32.305641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 14:52:32.305967 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:32.305939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"d976557e57b38d187eb9557a1a429a1e845a809749d35eef6d2f3a9cc99492d5"} Apr 16 14:52:32.317907 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:32.317874 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xzqsf" podStartSLOduration=3.704465255 podStartE2EDuration="22.31786281s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.637267593 +0000 UTC m=+3.006550044" lastFinishedPulling="2026-04-16 14:52:31.250665149 +0000 UTC m=+21.619947599" observedRunningTime="2026-04-16 14:52:32.316252293 +0000 UTC m=+22.685534766" watchObservedRunningTime="2026-04-16 14:52:32.31786281 +0000 UTC m=+22.687145281" Apr 16 14:52:33.200374 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:33.200348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:33.200540 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:33.200438 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:34.184519 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.183453 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:34.186721 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.184719 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:34.200400 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.200384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:34.200790 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.200414 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:34.200790 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:34.200477 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:34.200790 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:34.200597 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:34.311371 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.311150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerStarted","Data":"1d6f93c9c570287f1185a3f26665cb7df1dbf000decc590fb16b64e5efecc8dd"} Apr 16 14:52:34.316631 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.316612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 14:52:34.316954 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.316932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"d284d09d0da5077fe0ebd74d3c9b0a8a8944bd2ab360a3035d2d705aa5a6c960"} Apr 16 14:52:34.317169 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317149 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:34.317277 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:34.317277 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:34.317277 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317203 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:34.317416 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317330 2575 scope.go:117] "RemoveContainer" containerID="6e3a83ff5d01ef51cf2aae24228aa6ad26b4ae150fb1c1e5f74f35fc7e468a97" Apr 16 14:52:34.317762 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.317745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cfqtw" Apr 16 14:52:34.331184 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.331163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:34.331295 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:34.331281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:52:35.201248 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.201223 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:35.201581 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:35.201305 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:35.321945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.321908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 14:52:35.322352 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.322305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" event={"ID":"63b14197-55a5-4407-8c24-397ab7006750","Type":"ContainerStarted","Data":"038e35916505ec91abd02f84f5dce3e683d5aaaee4685cb7defb866f035e2212"} Apr 16 14:52:35.324080 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.324052 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="1d6f93c9c570287f1185a3f26665cb7df1dbf000decc590fb16b64e5efecc8dd" exitCode=0 Apr 16 14:52:35.324176 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.324135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"1d6f93c9c570287f1185a3f26665cb7df1dbf000decc590fb16b64e5efecc8dd"} Apr 16 14:52:35.353112 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:35.353049 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" podStartSLOduration=8.7523354 podStartE2EDuration="25.353028983s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.636845959 +0000 UTC m=+3.006128422" lastFinishedPulling="2026-04-16 14:52:29.237539554 +0000 UTC m=+19.606822005" observedRunningTime="2026-04-16 14:52:35.347448207 +0000 UTC m=+25.716730679" watchObservedRunningTime="2026-04-16 14:52:35.353028983 +0000 UTC m=+25.722311456" Apr 16 14:52:36.033147 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.033120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjjnh"] Apr 16 14:52:36.033283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.033274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:36.033398 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:36.033377 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:36.036008 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.035986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qf2x7"] Apr 16 14:52:36.036121 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.036108 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:36.036350 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:36.036273 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:36.036549 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.036527 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-85ccl"] Apr 16 14:52:36.036660 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.036611 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:36.036713 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:36.036692 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:36.327788 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.327710 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="93a030896c989b5d8f2f952532ccc7ea2cbcf3d18ca615bc5a8e67b65e237932" exitCode=0 Apr 16 14:52:36.327788 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:36.327756 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"93a030896c989b5d8f2f952532ccc7ea2cbcf3d18ca615bc5a8e67b65e237932"} Apr 16 14:52:37.201200 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:37.201136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:37.201321 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:37.201143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:37.201321 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:37.201251 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:37.201321 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:37.201312 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:37.331727 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:37.331700 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="cc873a385402054b40ddc3cd184904bacee7819d71a58e403aa55847a1ae5337" exitCode=0 Apr 16 14:52:37.332044 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:37.331735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"cc873a385402054b40ddc3cd184904bacee7819d71a58e403aa55847a1ae5337"} Apr 16 14:52:38.200576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:38.200552 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:38.200725 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:38.200643 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:39.201127 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:39.201096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:39.201579 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:39.201097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:39.201579 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:39.201237 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:39.201579 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:39.201286 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:40.201982 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:40.201911 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:40.202428 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:40.202016 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85ccl" podUID="4a74642f-0b47-4e56-931c-041808066f04" Apr 16 14:52:41.201391 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:41.201359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:41.201548 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:41.201358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:41.201548 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:41.201484 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kjjnh" podUID="0892e381-08bb-4454-99af-9dd414b35525" Apr 16 14:52:41.201658 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:41.201563 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:52:42.005622 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.005537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-229.ec2.internal" event="NodeReady" Apr 16 14:52:42.006090 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.005675 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:52:42.060097 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.060064 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd"] Apr 16 14:52:42.065044 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.065021 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w"] Apr 16 14:52:42.065184 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.065123 2575 util.go:30] "No sandbox for pod can be found. 
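At 14:52:42 the node finally reports NodeReady (the CNI config having arrived once ovnkube-node came up), and the very next thing in the log is a burst of SyncLoop ADD events: pods the scheduler had been holding back for a ready node. Detecting that flip from a client is a one-condition check; a minimal sketch against the core/v1 types:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    // isReady reports whether a node's Ready condition is True, the same
    // condition whose transition is recorded as the NodeReady event above.
    func isReady(node *corev1.Node) bool {
    	for _, c := range node.Status.Conditions {
    		if c.Type == corev1.NodeReady {
    			return c.Status == corev1.ConditionTrue
    		}
    	}
    	return false
    }

    func main() {
    	fmt.Println(isReady(&corev1.Node{})) // false until the condition is set
    }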
Apr 16 14:52:42.068267 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068242 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 14:52:42.068442 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 14:52:42.068442 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068331 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 14:52:42.068585 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068517 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"]
Apr 16 14:52:42.068645 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068630 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-kpbq7\""
Apr 16 14:52:42.068754 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.068713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w"
Apr 16 14:52:42.069620 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.069601 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 14:52:42.071154 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.071133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 14:52:42.071768 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.071749 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h"]
Apr 16 14:52:42.071948 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.071894 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"
Apr 16 14:52:42.074354 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.074338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-m2hzf\""
Apr 16 14:52:42.074477 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.074464 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 14:52:42.074803 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.074789 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 14:52:42.076029 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.075363 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d5bf687d7-xq82d"]
Apr 16 14:52:42.076029 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.075501 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h"
Apr 16 14:52:42.079127 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.078820 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 16 14:52:42.079294 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.079277 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 16 14:52:42.079585 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.079552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 16 14:52:42.080151 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.080123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd"]
Apr 16 14:52:42.080267 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.080150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"]
Apr 16 14:52:42.080964 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.080945 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w"]
Apr 16 14:52:42.081047 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.080991 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h"]
Apr 16 14:52:42.081047 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.081039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d"
Apr 16 14:52:42.081243 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.081225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 16 14:52:42.081986 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.081969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d5bf687d7-xq82d"]
Apr 16 14:52:42.083836 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.083807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:52:42.084122 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.084101 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:52:42.084204 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.084141 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-g2vw7\""
Apr 16 14:52:42.084281 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.084252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:52:42.090071 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.089294 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m6qlb"]
Apr 16 14:52:42.094343 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.093322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:52:42.096283 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.095897 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cvpwl"]
Apr 16 14:52:42.099430 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.099300 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cvpwl"
Apr 16 14:52:42.099570 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.099551 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:52:42.101085 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.101067 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cvpwl"]
Apr 16 14:52:42.102012 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.101995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:52:42.102309 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.102029 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m6qlb"]
Apr 16 14:52:42.103390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103041 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:52:42.103390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103065 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:52:42.103390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:52:42.103390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\""
Apr 16 14:52:42.103390 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103246 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:52:42.103661 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.103564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\""
Apr 16 14:52:42.201033 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.201009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl"
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:42.204073 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.204053 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:52:42.204073 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.204063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sk8kg\"" Apr 16 14:52:42.204073 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.204056 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:52:42.235223 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.235302 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24bf\" (UniqueName: \"kubernetes.io/projected/5803f957-e9d1-4ccf-a732-54889272611a-kube-api-access-b24bf\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.235302 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.235410 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235410 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43ed3899-5be3-4421-9337-d130a677964b-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.235410 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235368 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235410 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235396 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcgvg\" (UniqueName: \"kubernetes.io/projected/7badd0a7-a664-4046-8cb3-c1bf570dc29b-kube-api-access-jcgvg\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.235576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6e69161-e13e-41f0-88c7-8ce118b596e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.235576 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.235722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.235722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: 
\"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfsxx\" (UniqueName: \"kubernetes.io/projected/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-kube-api-access-cfsxx\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.235722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7badd0a7-a664-4046-8cb3-c1bf570dc29b-tmp-dir\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.235945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.235945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ed3899-5be3-4421-9337-d130a677964b-tmp\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.235945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.235945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.235945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.236118 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.235962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7badd0a7-a664-4046-8cb3-c1bf570dc29b-config-volume\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.236118 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.236003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj46\" (UniqueName: \"kubernetes.io/projected/43ed3899-5be3-4421-9337-d130a677964b-kube-api-access-9xj46\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.236118 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.236024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv6sw\" (UniqueName: \"kubernetes.io/projected/a6e69161-e13e-41f0-88c7-8ce118b596e5-kube-api-access-xv6sw\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.236118 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.236041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phgzp\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.236118 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.236067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.236328 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.236117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.336861 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.336861 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcgvg\" (UniqueName: \"kubernetes.io/projected/7badd0a7-a664-4046-8cb3-c1bf570dc29b-kube-api-access-jcgvg\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.336861 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336858 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6e69161-e13e-41f0-88c7-8ce118b596e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.336981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfsxx\" (UniqueName: \"kubernetes.io/projected/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-kube-api-access-cfsxx\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7badd0a7-a664-4046-8cb3-c1bf570dc29b-tmp-dir\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.337098 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337058 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ed3899-5be3-4421-9337-d130a677964b-tmp\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7badd0a7-a664-4046-8cb3-c1bf570dc29b-config-volume\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj46\" (UniqueName: \"kubernetes.io/projected/43ed3899-5be3-4421-9337-d130a677964b-kube-api-access-9xj46\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6sw\" (UniqueName: \"kubernetes.io/projected/a6e69161-e13e-41f0-88c7-8ce118b596e5-kube-api-access-xv6sw\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337346 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-phgzp\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337372 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337461 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b24bf\" (UniqueName: \"kubernetes.io/projected/5803f957-e9d1-4ccf-a732-54889272611a-kube-api-access-b24bf\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.337501 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.338070 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.338070 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43ed3899-5be3-4421-9337-d130a677964b-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.338070 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.338070 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.337633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.338070 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.338052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.338325 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.338165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7badd0a7-a664-4046-8cb3-c1bf570dc29b-tmp-dir\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.338648 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.338623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7badd0a7-a664-4046-8cb3-c1bf570dc29b-config-volume\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.338838 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.337292 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:42.338905 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.338845 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:42.338905 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.338894 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.838879572 +0000 UTC m=+33.208162042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:42.339062 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.338143 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:42.339108 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.339091 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. 
No retries permitted until 2026-04-16 14:52:42.839078238 +0000 UTC m=+33.208360688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:42.339108 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.338204 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:42.339206 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.339129 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.839120523 +0000 UTC m=+33.208402974 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:42.339206 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.338383 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:52:42.339206 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.339165 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:42.839156749 +0000 UTC m=+33.208439199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:52:42.339429 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.339306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.339477 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.339431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/43ed3899-5be3-4421-9337-d130a677964b-tmp\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.339830 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.339782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.340324 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.340302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.342789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.342710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.342789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.342742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.342789 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.342752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a6e69161-e13e-41f0-88c7-8ce118b596e5-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.343150 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.343125 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.343246 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.343166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.343567 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.343546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/43ed3899-5be3-4421-9337-d130a677964b-klusterlet-config\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.344268 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.344246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-ca\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.344922 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.344892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.346407 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.346364 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcgvg\" (UniqueName: \"kubernetes.io/projected/7badd0a7-a664-4046-8cb3-c1bf570dc29b-kube-api-access-jcgvg\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.347086 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.347029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfsxx\" (UniqueName: \"kubernetes.io/projected/9d8ac45a-7bb3-4a8a-899a-dde2387958a7-kube-api-access-cfsxx\") pod \"cluster-proxy-proxy-agent-5466b455c-wlw8h\" (UID: \"9d8ac45a-7bb3-4a8a-899a-dde2387958a7\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.347086 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.347039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6sw\" (UniqueName: \"kubernetes.io/projected/a6e69161-e13e-41f0-88c7-8ce118b596e5-kube-api-access-xv6sw\") pod \"managed-serviceaccount-addon-agent-6587d65555-ws6kd\" (UID: \"a6e69161-e13e-41f0-88c7-8ce118b596e5\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.348204 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.348160 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj46\" (UniqueName: \"kubernetes.io/projected/43ed3899-5be3-4421-9337-d130a677964b-kube-api-access-9xj46\") pod \"klusterlet-addon-workmgr-57f8b5cc7c-rbr8w\" (UID: \"43ed3899-5be3-4421-9337-d130a677964b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.348832 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.348814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24bf\" (UniqueName: \"kubernetes.io/projected/5803f957-e9d1-4ccf-a732-54889272611a-kube-api-access-b24bf\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.349203 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.349177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.350267 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.350250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phgzp\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.387845 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.387815 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" Apr 16 14:52:42.397580 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.397556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:42.414311 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.414288 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" Apr 16 14:52:42.790266 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.790074 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd"] Apr 16 14:52:42.795856 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:42.795478 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6e69161_e13e_41f0_88c7_8ce118b596e5.slice/crio-01c47104635e67776a26ac2de755c031391f59ac3d456fd5196b21921315b3cf WatchSource:0}: Error finding container 01c47104635e67776a26ac2de755c031391f59ac3d456fd5196b21921315b3cf: Status 404 returned error can't find the container with id 01c47104635e67776a26ac2de755c031391f59ac3d456fd5196b21921315b3cf Apr 16 14:52:42.796425 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.796230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h"] Apr 16 14:52:42.800290 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:42.800249 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d8ac45a_7bb3_4a8a_899a_dde2387958a7.slice/crio-659fb1418441d264b0bc44f853fdde1d59ae6758924f930909ff5791ea5636e1 WatchSource:0}: Error finding container 659fb1418441d264b0bc44f853fdde1d59ae6758924f930909ff5791ea5636e1: Status 404 returned error can't find the container with id 659fb1418441d264b0bc44f853fdde1d59ae6758924f930909ff5791ea5636e1 Apr 16 14:52:42.803110 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.803076 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w"] Apr 16 14:52:42.807741 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:42.807718 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ed3899_5be3_4421_9337_d130a677964b.slice/crio-c0396f27945de7d11e332de290a6733be73896698c8075622c4aee8c63a6b2f7 WatchSource:0}: Error finding container c0396f27945de7d11e332de290a6733be73896698c8075622c4aee8c63a6b2f7: Status 404 returned error can't find the container with id c0396f27945de7d11e332de290a6733be73896698c8075622c4aee8c63a6b2f7 Apr 16 14:52:42.841819 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.841797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:42.841895 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.841837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:42.841950 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.841912 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 16 14:52:42.841999 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.841957 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.841944267 +0000 UTC m=+34.211226717 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:52:42.841999 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.841914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:42.842100 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:42.842000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:42.842100 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842035 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:42.842100 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842058 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:42.842242 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842112 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.842093452 +0000 UTC m=+34.211375906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:42.842242 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842137 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:42.842242 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842172 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:42.842242 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842187 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.842173414 +0000 UTC m=+34.211455876 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:42.842242 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:42.842205 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:43.842194666 +0000 UTC m=+34.211477118 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:43.200851 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.200793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:43.201449 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.200969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:43.203796 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.203779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:52:43.203882 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.203844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\"" Apr 16 14:52:43.203966 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.203880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:52:43.347382 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.347360 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="853e0bdd0ebe5c1b387518e8ab42504dd6e639af80d403dcd52f036cca3fc6d1" exitCode=0 Apr 16 14:52:43.347483 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.347410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"853e0bdd0ebe5c1b387518e8ab42504dd6e639af80d403dcd52f036cca3fc6d1"} Apr 16 14:52:43.348359 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.348343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" event={"ID":"43ed3899-5be3-4421-9337-d130a677964b","Type":"ContainerStarted","Data":"c0396f27945de7d11e332de290a6733be73896698c8075622c4aee8c63a6b2f7"} Apr 16 14:52:43.349179 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.349162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" event={"ID":"a6e69161-e13e-41f0-88c7-8ce118b596e5","Type":"ContainerStarted","Data":"01c47104635e67776a26ac2de755c031391f59ac3d456fd5196b21921315b3cf"} Apr 16 14:52:43.349988 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.349965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" 
event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerStarted","Data":"659fb1418441d264b0bc44f853fdde1d59ae6758924f930909ff5791ea5636e1"} Apr 16 14:52:43.849506 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.849459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:43.849673 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.849523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:43.849673 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.849549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:43.849673 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.849619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:43.849673 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.849647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:43.849883 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.849766 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:43.849883 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.849828 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.849810194 +0000 UTC m=+36.219092656 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:43.850811 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.850791 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:43.850811 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.850811 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:43.850975 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.850858 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.850842776 +0000 UTC m=+36.220125246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:43.850975 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.850915 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:52:43.850975 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.850948 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.850937329 +0000 UTC m=+36.220219792 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:52:43.851140 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.851001 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:43.851140 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.851032 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:45.85102058 +0000 UTC m=+36.220303043 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:43.858665 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.858641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqc9z\" (UniqueName: \"kubernetes.io/projected/4a74642f-0b47-4e56-931c-041808066f04-kube-api-access-qqc9z\") pod \"network-check-target-85ccl\" (UID: \"4a74642f-0b47-4e56-931c-041808066f04\") " pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:43.950585 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.950551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:52:43.950757 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.950679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:43.951922 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.951601 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:52:43.951922 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:43.951668 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:53:15.95164958 +0000 UTC m=+66.320932044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : secret "metrics-daemon-secret" not found Apr 16 14:52:43.964074 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:43.964047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0892e381-08bb-4454-99af-9dd414b35525-original-pull-secret\") pod \"global-pull-secret-syncer-kjjnh\" (UID: \"0892e381-08bb-4454-99af-9dd414b35525\") " pod="kube-system/global-pull-secret-syncer-kjjnh" Apr 16 14:52:44.010781 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.010423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:44.114656 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.114412 2575 util.go:30] "No sandbox for pod can be found. 
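The failures above all have the same shape: the kubelet starts MountVolume for a secret-backed volume, finds the referenced Secret missing (canary-serving-cert, image-registry-tls, networking-console-plugin-cert, dns-default-metrics-tls, metrics-daemon-secret), and schedules a retry; the affected pods cannot start until the owning operators publish those Secrets. A minimal client-go sketch for checking which of the referenced Secrets exist yet; the program and the kubeconfig path are illustrative assumptions, and only the namespace/name pairs come from the log:

```go
// check_secrets.go — diagnostic sketch, not part of the kubelet.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a reachable cluster via the default kubeconfig location.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Namespace/name pairs taken from the "Couldn't get secret" entries above.
	for _, ref := range []struct{ ns, name string }{
		{"openshift-ingress-canary", "canary-serving-cert"},
		{"openshift-image-registry", "image-registry-tls"},
		{"openshift-network-console", "networking-console-plugin-cert"},
		{"openshift-dns", "dns-default-metrics-tls"},
		{"openshift-multus", "metrics-daemon-secret"},
	} {
		if _, err := cs.CoreV1().Secrets(ref.ns).Get(context.TODO(), ref.name, metav1.GetOptions{}); err != nil {
			fmt.Printf("%s/%s: %v\n", ref.ns, ref.name, err)
			continue
		}
		fmt.Printf("%s/%s: present\n", ref.ns, ref.name)
	}
}
```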
Apr 16 14:52:44.180380 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.180318 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-85ccl"]
Apr 16 14:52:44.186350 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:44.186320 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a74642f_0b47_4e56_931c_041808066f04.slice/crio-9878558e68b588671e61b7841b997e52ce4a68117f84227a51ad8549bda144ab WatchSource:0}: Error finding container 9878558e68b588671e61b7841b997e52ce4a68117f84227a51ad8549bda144ab: Status 404 returned error can't find the container with id 9878558e68b588671e61b7841b997e52ce4a68117f84227a51ad8549bda144ab
Apr 16 14:52:44.292060 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.291996 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kjjnh"]
Apr 16 14:52:44.296344 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:52:44.296316 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0892e381_08bb_4454_99af_9dd414b35525.slice/crio-36838afda4e8f26553fd9d4d15099fffe5877ffa00068e3da61101913371ebba WatchSource:0}: Error finding container 36838afda4e8f26553fd9d4d15099fffe5877ffa00068e3da61101913371ebba: Status 404 returned error can't find the container with id 36838afda4e8f26553fd9d4d15099fffe5877ffa00068e3da61101913371ebba
Apr 16 14:52:44.375462 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.374499 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0f509b4-277a-43cc-ab72-a486e31674af" containerID="61a7ae1d2bee37aecdaeacd98c2c546768154295f1c3a1accd6ddd12679364ad" exitCode=0
Apr 16 14:52:44.375462 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.374572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerDied","Data":"61a7ae1d2bee37aecdaeacd98c2c546768154295f1c3a1accd6ddd12679364ad"}
Apr 16 14:52:44.383047 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.382894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-85ccl" event={"ID":"4a74642f-0b47-4e56-931c-041808066f04","Type":"ContainerStarted","Data":"9878558e68b588671e61b7841b997e52ce4a68117f84227a51ad8549bda144ab"}
Apr 16 14:52:44.388470 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:44.388381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjjnh" event={"ID":"0892e381-08bb-4454-99af-9dd414b35525","Type":"ContainerStarted","Data":"36838afda4e8f26553fd9d4d15099fffe5877ffa00068e3da61101913371ebba"}
Apr 16 14:52:45.397558 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.397306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" event={"ID":"e0f509b4-277a-43cc-ab72-a486e31674af","Type":"ContainerStarted","Data":"f32967fab7e101764644d3994754aaa63c77474dfdb65efd62d283f50345a801"}
Apr 16 14:52:45.419345 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.419288 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9pkk7" podStartSLOduration=5.243477756 podStartE2EDuration="35.419271829s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.635986497 +0000 UTC m=+3.005268951" lastFinishedPulling="2026-04-16 14:52:42.81178057 +0000 UTC m=+33.181063024" observedRunningTime="2026-04-16 14:52:45.417725458 +0000 UTC m=+35.787007940" watchObservedRunningTime="2026-04-16 14:52:45.419271829 +0000 UTC m=+35.788554301"
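The pod_startup_latency_tracker entry above is internally consistent: podStartSLOduration is the end-to-end startup time minus the image-pull window, i.e. 35.419271829s - (14:52:42.81178057 - 14:52:12.635986497) = 5.243477756s, so image pulls do not count against the startup SLO. A small Go check of that arithmetic, using only the values from the entry (the layout string is the standard Go format those timestamps print with):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	firstPull := parse("2026-04-16 14:52:12.635986497 +0000 UTC") // firstStartedPulling
	lastPull := parse("2026-04-16 14:52:42.81178057 +0000 UTC")   // lastFinishedPulling
	e2e, _ := time.ParseDuration("35.419271829s")                 // podStartE2EDuration

	// SLO duration = end-to-end startup minus the image-pull window.
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(slo) // 5.243477756s, matching podStartSLOduration
}
```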
14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:12.635986497 +0000 UTC m=+3.005268951" lastFinishedPulling="2026-04-16 14:52:42.81178057 +0000 UTC m=+33.181063024" observedRunningTime="2026-04-16 14:52:45.417725458 +0000 UTC m=+35.787007940" watchObservedRunningTime="2026-04-16 14:52:45.419271829 +0000 UTC m=+35.788554301" Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.875767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.875894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.875954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:45.875986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876013 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:45.876241 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876128 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.876106166 +0000 UTC m=+40.245388632 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876300 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876317 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.876351684 +0000 UTC m=+40.245634148 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876422 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876447 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876525 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.876508271 +0000 UTC m=+40.245790722 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:45.876708 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:45.876547 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:49.876535415 +0000 UTC m=+40.245817867 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:52:49.908936 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:49.908902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:49.908950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:49.909012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:49.909041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909051 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909117 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:57.909098017 +0000 UTC m=+48.278380474 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909154 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909177 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909243 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:57.909226855 +0000 UTC m=+48.278509318 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909258 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:52:57.90929781 +0000 UTC m=+48.278580265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909321 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:49.909624 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:49.909367 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:52:57.909355381 +0000 UTC m=+48.278637846 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:52.413466 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.413427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerStarted","Data":"e4c862e4895cafc05a8e2e6a39a3179f4913d2d84325d8f1bf58d6dd0a04007b"} Apr 16 14:52:52.414785 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.414725 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-85ccl" event={"ID":"4a74642f-0b47-4e56-931c-041808066f04","Type":"ContainerStarted","Data":"7db6ca7d27919d9bbbb96ccd1d60228fef01e35837243d1fc721cb5b8acaae85"} Apr 16 14:52:52.414910 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.414857 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:52:52.416084 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.416067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" event={"ID":"43ed3899-5be3-4421-9337-d130a677964b","Type":"ContainerStarted","Data":"9a35db43c3eb45c0721ea6cb0561508eb55bb14521e44024d3fc652a21dbe99d"} Apr 16 14:52:52.416279 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.416244 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:52.417593 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.417551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" event={"ID":"a6e69161-e13e-41f0-88c7-8ce118b596e5","Type":"ContainerStarted","Data":"65e9f1fcedd478c2c3b142b82e1649a9a7b48d117711b7c24e1d3b0f05cae37d"} Apr 16 14:52:52.418173 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.418153 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:52:52.418957 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.418938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kjjnh" event={"ID":"0892e381-08bb-4454-99af-9dd414b35525","Type":"ContainerStarted","Data":"4a64d5387c72c4f977cf4d542043c7a5e7167bd1c7242e29111b52305e94c73b"} Apr 16 14:52:52.430869 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.430824 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-85ccl" podStartSLOduration=34.736385022 podStartE2EDuration="42.430810182s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.189766406 +0000 UTC m=+34.559048862" lastFinishedPulling="2026-04-16 14:52:51.884191567 +0000 UTC m=+42.253474022" observedRunningTime="2026-04-16 14:52:52.429638609 +0000 UTC m=+42.798921082" watchObservedRunningTime="2026-04-16 14:52:52.430810182 +0000 UTC m=+42.800092652" Apr 16 14:52:52.445561 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.445519 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kjjnh" podStartSLOduration=33.578466082 podStartE2EDuration="41.445507841s" podCreationTimestamp="2026-04-16 14:52:11 +0000 UTC" firstStartedPulling="2026-04-16 14:52:44.299056926 +0000 UTC m=+34.668339391" lastFinishedPulling="2026-04-16 14:52:52.1660987 +0000 UTC m=+42.535381150" observedRunningTime="2026-04-16 14:52:52.4450593 +0000 UTC m=+42.814341773" watchObservedRunningTime="2026-04-16 14:52:52.445507841 +0000 UTC m=+42.814790313" Apr 16 14:52:52.458633 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.458601 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" podStartSLOduration=11.372366235 podStartE2EDuration="20.458591632s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="2026-04-16 14:52:42.797500199 +0000 UTC m=+33.166782649" lastFinishedPulling="2026-04-16 14:52:51.883725592 +0000 UTC m=+42.253008046" observedRunningTime="2026-04-16 14:52:52.457942398 +0000 UTC m=+42.827224869" watchObservedRunningTime="2026-04-16 14:52:52.458591632 +0000 UTC m=+42.827874082" Apr 16 14:52:52.475171 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:52.475099 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" podStartSLOduration=11.400704932 podStartE2EDuration="20.475084674s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="2026-04-16 14:52:42.80935357 +0000 UTC m=+33.178636021" lastFinishedPulling="2026-04-16 14:52:51.883733309 +0000 UTC m=+42.253015763" observedRunningTime="2026-04-16 14:52:52.473968165 +0000 UTC m=+42.843250643" watchObservedRunningTime="2026-04-16 14:52:52.475084674 +0000 UTC m=+42.844367148" Apr 16 14:52:54.426990 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:54.426962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerStarted","Data":"31ad13000fd8f7f286e4a37b66afaaccef90674c13a07dccadb925f697c55e2f"} Apr 16 14:52:55.430826 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:55.430751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerStarted","Data":"91f3d9bb17d0c0f7186ec746ec20052396dcb4cd6d73c8eccb036029438885bb"} Apr 16 14:52:55.452037 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:55.451997 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" podStartSLOduration=11.969366064 podStartE2EDuration="23.451985912s" podCreationTimestamp="2026-04-16 14:52:32 +0000 UTC" firstStartedPulling="2026-04-16 14:52:42.802076416 +0000 UTC m=+33.171358870" lastFinishedPulling="2026-04-16 14:52:54.284696268 +0000 UTC m=+44.653978718" observedRunningTime="2026-04-16 14:52:55.450437932 +0000 UTC m=+45.819720402" watchObservedRunningTime="2026-04-16 14:52:55.451985912 +0000 UTC m=+45.821268384" Apr 16 14:52:57.970519 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:57.970484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:52:57.970519 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:57.970524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:57.970551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:52:57.970570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970624 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970647 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970657 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970672 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970626 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970691 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.970676903 +0000 UTC m=+64.339959353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970732 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:13.970721611 +0000 UTC m=+64.340004061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970742 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.97073616 +0000 UTC m=+64.340018610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:52:57.971011 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:52:57.970752 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:13.970746361 +0000 UTC m=+64.340028812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:53:06.338702 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:06.338676 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-75cxs" Apr 16 14:53:13.974814 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:13.974781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:53:13.974814 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:13.974821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:13.974851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:13.974869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: 
\"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974935 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974959 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974969 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974935 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974999 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:53:45.974984711 +0000 UTC m=+96.344267167 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.975017 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:53:45.975006827 +0000 UTC m=+96.344289277 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.974942 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.975027 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:45.975022561 +0000 UTC m=+96.344305011 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:53:13.975346 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:13.975076 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:45.975064893 +0000 UTC m=+96.344347343 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:53:15.987731 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:15.987692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:53:15.988127 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:15.987804 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:15.988127 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:15.987863 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:54:19.987850321 +0000 UTC m=+130.357132771 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : secret "metrics-daemon-secret" not found Apr 16 14:53:23.424262 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:23.424228 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-85ccl" Apr 16 14:53:45.983567 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:45.983529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:53:45.983567 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:45.983572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:45.983614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:53:45.983642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " 
pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983683 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983737 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983756 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:54:49.983738838 +0000 UTC m=+160.353021300 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983758 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983799 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:54:49.983786124 +0000 UTC m=+160.353068574 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983757 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983814 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983817 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:49.983807797 +0000 UTC m=+160.353090247 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:53:45.984005 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:53:45.983851 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:49.983830784 +0000 UTC m=+160.353113234 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls") pod "image-registry-d5bf687d7-xq82d" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7") : secret "image-registry-tls" not found Apr 16 14:54:20.017903 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:20.017867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:54:20.018354 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:20.017995 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:54:20.018354 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:20.018063 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs podName:012eecca-9f9b-4a13-8adc-05b585fd794b nodeName:}" failed. No retries permitted until 2026-04-16 14:56:22.018048331 +0000 UTC m=+252.387330782 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs") pod "network-metrics-daemon-qf2x7" (UID: "012eecca-9f9b-4a13-8adc-05b585fd794b") : secret "metrics-daemon-secret" not found Apr 16 14:54:43.628960 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:43.628932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vrpg_55ad67c5-7bb1-4c6c-8c58-869beff80d7f/dns-node-resolver/0.log" Apr 16 14:54:44.829069 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:44.829038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnmnk_38c4c3ee-509f-467b-b050-9b723bfce014/node-ca/0.log" Apr 16 14:54:45.106279 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:45.106249 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" podUID="8b8f0df0-4d3f-4fdd-894b-fd928f0d7481" Apr 16 14:54:45.126897 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:45.126864 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" podUID="1c1d77f5-df10-4280-99eb-47d03799e2f7" Apr 16 14:54:45.135007 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:45.134986 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cvpwl" podUID="5803f957-e9d1-4ccf-a732-54889272611a" Apr 16 14:54:45.141147 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:45.141129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m6qlb" 
podUID="7badd0a7-a664-4046-8cb3-c1bf570dc29b" Apr 16 14:54:45.685726 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:45.685701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:54:45.685872 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:45.685814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m6qlb" Apr 16 14:54:45.685951 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:45.685938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:54:45.686039 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:45.686021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:54:46.210506 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:46.210475 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qf2x7" podUID="012eecca-9f9b-4a13-8adc-05b585fd794b" Apr 16 14:54:50.016742 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:50.016703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb" Apr 16 14:54:50.016742 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:50.016746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl" Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016838 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016840 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:50.016868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016890 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert podName:5803f957-e9d1-4ccf-a732-54889272611a nodeName:}" failed. No retries permitted until 2026-04-16 14:56:52.016877045 +0000 UTC m=+282.386159495 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert") pod "ingress-canary-cvpwl" (UID: "5803f957-e9d1-4ccf-a732-54889272611a") : secret "canary-serving-cert" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016905 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls podName:7badd0a7-a664-4046-8cb3-c1bf570dc29b nodeName:}" failed. No retries permitted until 2026-04-16 14:56:52.016898498 +0000 UTC m=+282.386180948 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls") pod "dns-default-m6qlb" (UID: "7badd0a7-a664-4046-8cb3-c1bf570dc29b") : secret "dns-default-metrics-tls" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:50.016924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") pod \"image-registry-d5bf687d7-xq82d\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") " pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016933 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016976 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert podName:8b8f0df0-4d3f-4fdd-894b-fd928f0d7481 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:52.016964868 +0000 UTC m=+282.386247318 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-5wrz2" (UID: "8b8f0df0-4d3f-4fdd-894b-fd928f0d7481") : secret "networking-console-plugin-cert" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016987 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.016994 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d5bf687d7-xq82d: secret "image-registry-tls" not found Apr 16 14:54:50.017136 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:54:50.017019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls podName:1c1d77f5-df10-4280-99eb-47d03799e2f7 nodeName:}" failed. No retries permitted until 2026-04-16 14:56:52.017013123 +0000 UTC m=+282.386295573 (durationBeforeRetry 2m2s). 
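Across the retries above, durationBeforeRetry doubles per volume: 2s, 4s, 8s, 16s, 32s, 1m4s, and then holds at 2m2s (metrics-certs, which began failing earlier, reaches 2m2s first). A minimal sketch of that schedule; the initial delay and the cap are read off this log, not taken from kubelet source:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initial  = 2 * time.Second               // first durationBeforeRetry seen above
		maxDelay = 2*time.Minute + 2*time.Second // largest value the log shows (2m2s)
	)
	// Each failed MountVolume doubles its per-volume delay until the cap.
	delay := initial
	for i := 0; i < 8; i++ {
		fmt.Println(delay) // 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```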
Apr 16 14:54:52.388764 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.388719 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" podUID="a6e69161-e13e-41f0-88c7-8ce118b596e5" containerName="addon-agent" probeResult="failure" output="Get \"http://10.132.0.6:8000/healthz\": dial tcp 10.132.0.6:8000: connect: connection refused"
Apr 16 14:54:52.398663 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.398637 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" podUID="43ed3899-5be3-4421-9337-d130a677964b" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/healthz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 16 14:54:52.416580 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.416557 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" podUID="43ed3899-5be3-4421-9337-d130a677964b" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 16 14:54:52.703000 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.702930 2575 generic.go:358] "Generic (PLEG): container finished" podID="43ed3899-5be3-4421-9337-d130a677964b" containerID="9a35db43c3eb45c0721ea6cb0561508eb55bb14521e44024d3fc652a21dbe99d" exitCode=1
Apr 16 14:54:52.703000 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.702991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" event={"ID":"43ed3899-5be3-4421-9337-d130a677964b","Type":"ContainerDied","Data":"9a35db43c3eb45c0721ea6cb0561508eb55bb14521e44024d3fc652a21dbe99d"}
Apr 16 14:54:52.703345 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.703328 2575 scope.go:117] "RemoveContainer" containerID="9a35db43c3eb45c0721ea6cb0561508eb55bb14521e44024d3fc652a21dbe99d"
Apr 16 14:54:52.704126 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.704108 2575 generic.go:358] "Generic (PLEG): container finished" podID="a6e69161-e13e-41f0-88c7-8ce118b596e5" containerID="65e9f1fcedd478c2c3b142b82e1649a9a7b48d117711b7c24e1d3b0f05cae37d" exitCode=255
Apr 16 14:54:52.704241 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.704135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" event={"ID":"a6e69161-e13e-41f0-88c7-8ce118b596e5","Type":"ContainerDied","Data":"65e9f1fcedd478c2c3b142b82e1649a9a7b48d117711b7c24e1d3b0f05cae37d"}
Apr 16 14:54:52.704376 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:52.704361 2575 scope.go:117] "RemoveContainer" containerID="65e9f1fcedd478c2c3b142b82e1649a9a7b48d117711b7c24e1d3b0f05cae37d"
Apr 16 14:54:53.707829 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:53.707792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" event={"ID":"43ed3899-5be3-4421-9337-d130a677964b","Type":"ContainerStarted","Data":"b8a46bb06093807b1dbc856eb794447f2edd0f7d1fbab3b9a41bc8265e98626a"}
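The prober entries above record liveness failures (connection refused on the pods' :8000/healthz endpoints), after which both containers exit (exitCode=1 and exitCode=255), the old container IDs are removed, and replacements are started. A minimal HTTP check in the spirit of what prober.go logs; the URL comes from the log, the 1s timeout is an assumed value, and this illustrates the semantics rather than reproducing the kubelet's prober:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe treats a connect error (like the "connection refused" above) or a
// non-2xx status as failure, which is how the logged results read.
func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. dial tcp 10.132.0.7:8000: connect: connection refused
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probe("http://10.132.0.7:8000/healthz"); err != nil {
		fmt.Println("probe failed:", err)
	}
}
```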
event={"ID":"43ed3899-5be3-4421-9337-d130a677964b","Type":"ContainerStarted","Data":"b8a46bb06093807b1dbc856eb794447f2edd0f7d1fbab3b9a41bc8265e98626a"} Apr 16 14:54:53.708248 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:53.708116 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:54:53.709063 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:53.709040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-57f8b5cc7c-rbr8w" Apr 16 14:54:53.709414 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:53.709397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6587d65555-ws6kd" event={"ID":"a6e69161-e13e-41f0-88c7-8ce118b596e5","Type":"ContainerStarted","Data":"9999f59714e391dc59bbd9af8fe62a82e1c4bbd912f3016e812804b492438802"} Apr 16 14:54:59.200775 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:54:59.200713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7" Apr 16 14:55:09.347270 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.347234 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rghv9"] Apr 16 14:55:09.350384 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.350368 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.353638 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.353613 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:09.353747 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.353694 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:09.354438 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.354418 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:55:09.354532 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.354511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bg2mj\"" Apr 16 14:55:09.354596 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.354568 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:55:09.360543 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.360518 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rghv9"] Apr 16 14:55:09.444781 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.444755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9a74039-5a10-491d-97b9-0057432158ab-data-volume\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.444972 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.444799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/e9a74039-5a10-491d-97b9-0057432158ab-crio-socket\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.444972 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.444894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9a74039-5a10-491d-97b9-0057432158ab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.444972 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.444944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qccjg\" (UniqueName: \"kubernetes.io/projected/e9a74039-5a10-491d-97b9-0057432158ab-kube-api-access-qccjg\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.445158 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.445014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9a74039-5a10-491d-97b9-0057432158ab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546080 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e9a74039-5a10-491d-97b9-0057432158ab-crio-socket\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546158 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9a74039-5a10-491d-97b9-0057432158ab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546158 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qccjg\" (UniqueName: \"kubernetes.io/projected/e9a74039-5a10-491d-97b9-0057432158ab-kube-api-access-qccjg\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546158 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9a74039-5a10-491d-97b9-0057432158ab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546286 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/e9a74039-5a10-491d-97b9-0057432158ab-crio-socket\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546286 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9a74039-5a10-491d-97b9-0057432158ab-data-volume\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546722 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9a74039-5a10-491d-97b9-0057432158ab-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.546990 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.546975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9a74039-5a10-491d-97b9-0057432158ab-data-volume\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.549460 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.549440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9a74039-5a10-491d-97b9-0057432158ab-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.553565 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.553545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qccjg\" (UniqueName: \"kubernetes.io/projected/e9a74039-5a10-491d-97b9-0057432158ab-kube-api-access-qccjg\") pod \"insights-runtime-extractor-rghv9\" (UID: \"e9a74039-5a10-491d-97b9-0057432158ab\") " pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.659379 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.659333 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rghv9" Apr 16 14:55:09.780613 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:09.780588 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rghv9"] Apr 16 14:55:09.783186 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:55:09.783162 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a74039_5a10_491d_97b9_0057432158ab.slice/crio-5cdb3cf44db9fbccc89f2983d7ca3350a324e47e2580e5a4d73f1e2fae7f5f0f WatchSource:0}: Error finding container 5cdb3cf44db9fbccc89f2983d7ca3350a324e47e2580e5a4d73f1e2fae7f5f0f: Status 404 returned error can't find the container with id 5cdb3cf44db9fbccc89f2983d7ca3350a324e47e2580e5a4d73f1e2fae7f5f0f Apr 16 14:55:10.747369 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:10.747295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rghv9" event={"ID":"e9a74039-5a10-491d-97b9-0057432158ab","Type":"ContainerStarted","Data":"f06afc1d0ea309cd28e1ca2069b5ea6ecc1b1df19146695296edea581166249a"} Apr 16 14:55:10.747369 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:10.747331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rghv9" event={"ID":"e9a74039-5a10-491d-97b9-0057432158ab","Type":"ContainerStarted","Data":"8eb45751bd866279f8a8b638703313703c2abc2699796e1d709145c78263ba6d"} Apr 16 14:55:10.747369 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:10.747339 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rghv9" event={"ID":"e9a74039-5a10-491d-97b9-0057432158ab","Type":"ContainerStarted","Data":"5cdb3cf44db9fbccc89f2983d7ca3350a324e47e2580e5a4d73f1e2fae7f5f0f"} Apr 16 14:55:11.751076 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:11.751044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rghv9" event={"ID":"e9a74039-5a10-491d-97b9-0057432158ab","Type":"ContainerStarted","Data":"10f30ff7f262db53794956586622a9fc2937707170f8452d2638682aaba2e2b8"} Apr 16 14:55:11.770965 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:11.770924 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rghv9" podStartSLOduration=0.997608973 podStartE2EDuration="2.770912724s" podCreationTimestamp="2026-04-16 14:55:09 +0000 UTC" firstStartedPulling="2026-04-16 14:55:09.841347129 +0000 UTC m=+180.210629582" lastFinishedPulling="2026-04-16 14:55:11.61465087 +0000 UTC m=+181.983933333" observedRunningTime="2026-04-16 14:55:11.768559485 +0000 UTC m=+182.137841957" watchObservedRunningTime="2026-04-16 14:55:11.770912724 +0000 UTC m=+182.140195195" Apr 16 14:55:21.981579 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.981549 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qkrz2"] Apr 16 14:55:21.984650 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.984630 2575 util.go:30] "No sandbox for pod can be found. 
Apr 16 14:55:21.988116 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.988087 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:55:21.988116 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.988114 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:55:21.988312 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.988134 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:55:21.988312 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.988168 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:55:21.989448 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.989430 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:55:21.989567 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.989444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p6x8s\""
Apr 16 14:55:21.989567 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:21.989429 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:55:22.040248 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-wtmp\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040409 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-textfile\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040409 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-tls\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040409 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040409 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-sys\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040409 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-root\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040632 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040632 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4s7z\" (UniqueName: \"kubernetes.io/projected/89195b92-e59e-4af4-b4c4-06dc7d7e796e-kube-api-access-q4s7z\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.040632 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.040544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-metrics-client-ca\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141271 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-tls\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141360 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141360 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-sys\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141360 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-sys\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141466 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-root\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141466 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141466 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-root\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141598 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4s7z\" (UniqueName: \"kubernetes.io/projected/89195b92-e59e-4af4-b4c4-06dc7d7e796e-kube-api-access-q4s7z\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141656 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-metrics-client-ca\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141706 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-wtmp\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141832 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-wtmp\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.141904 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-textfile\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.146419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.141932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-accelerators-collector-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.146419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.142393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-textfile\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.146419 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.142509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89195b92-e59e-4af4-b4c4-06dc7d7e796e-metrics-client-ca\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.146942 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.146924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.147017 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.146998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/89195b92-e59e-4af4-b4c4-06dc7d7e796e-node-exporter-tls\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.149268 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.149251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4s7z\" (UniqueName: \"kubernetes.io/projected/89195b92-e59e-4af4-b4c4-06dc7d7e796e-kube-api-access-q4s7z\") pod \"node-exporter-qkrz2\" (UID: \"89195b92-e59e-4af4-b4c4-06dc7d7e796e\") " pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.293051 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.293004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qkrz2"
Apr 16 14:55:22.302186 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:55:22.302163 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89195b92_e59e_4af4_b4c4_06dc7d7e796e.slice/crio-5a87df65afa0feb36b43678899afa2a2894d1bc4e3a912bc2b2c0fdd2affbde4 WatchSource:0}: Error finding container 5a87df65afa0feb36b43678899afa2a2894d1bc4e3a912bc2b2c0fdd2affbde4: Status 404 returned error can't find the container with id 5a87df65afa0feb36b43678899afa2a2894d1bc4e3a912bc2b2c0fdd2affbde4
Apr 16 14:55:22.776393 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:22.776360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkrz2" event={"ID":"89195b92-e59e-4af4-b4c4-06dc7d7e796e","Type":"ContainerStarted","Data":"5a87df65afa0feb36b43678899afa2a2894d1bc4e3a912bc2b2c0fdd2affbde4"}
Apr 16 14:55:23.780055 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:23.780026 2575 generic.go:358] "Generic (PLEG): container finished" podID="89195b92-e59e-4af4-b4c4-06dc7d7e796e" containerID="1524838e5eb513312ec9cd5f503deed1277b5aa166d7e32f41f1160394aaaeeb" exitCode=0
Apr 16 14:55:23.780510 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:23.780080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkrz2" event={"ID":"89195b92-e59e-4af4-b4c4-06dc7d7e796e","Type":"ContainerDied","Data":"1524838e5eb513312ec9cd5f503deed1277b5aa166d7e32f41f1160394aaaeeb"}
Apr 16 14:55:24.784865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:24.784827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkrz2" event={"ID":"89195b92-e59e-4af4-b4c4-06dc7d7e796e","Type":"ContainerStarted","Data":"a4a80226f065c0672e76a0a649819f5775154a3ccb8a9c2d76713111660f35fe"}
Apr 16 14:55:24.784865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:24.784867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qkrz2" event={"ID":"89195b92-e59e-4af4-b4c4-06dc7d7e796e","Type":"ContainerStarted","Data":"67b0f524a133f013a0d5222c6ba80988333726b7f7a0b71b2d1d100a5fb0e527"}
Apr 16 14:55:24.803881 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:24.803834 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qkrz2" podStartSLOduration=3.114892757 podStartE2EDuration="3.80382133s" podCreationTimestamp="2026-04-16 14:55:21 +0000 UTC" firstStartedPulling="2026-04-16 14:55:22.304321868 +0000 UTC m=+192.673604320" lastFinishedPulling="2026-04-16 14:55:22.993250442 +0000 UTC m=+193.362532893" observedRunningTime="2026-04-16 14:55:24.802118081 +0000 UTC m=+195.171400554" watchObservedRunningTime="2026-04-16 14:55:24.80382133 +0000 UTC m=+195.173103802"
Apr 16 14:55:31.001945 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.001915 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5bf687d7-xq82d"]
Apr 16 14:55:31.002306 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:55:31.002058 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d" podUID="1c1d77f5-df10-4280-99eb-47d03799e2f7"
Apr 16 14:55:31.801246 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.801196 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d"
Apr 16 14:55:31.805196 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.805178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d"
Apr 16 14:55:31.908865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908838 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.908865 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908868 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909057 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908886 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phgzp\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909057 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908914 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909057 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908943 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909057 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.908976 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909057 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.909003 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates\") pod \"1c1d77f5-df10-4280-99eb-47d03799e2f7\" (UID: \"1c1d77f5-df10-4280-99eb-47d03799e2f7\") "
Apr 16 14:55:31.909520 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.909466 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:55:31.909520 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.909501 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:31.909658 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.909583 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:55:31.911333 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.911309 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:31.911431 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.911373 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:55:31.911505 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.911484 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp" (OuterVolumeSpecName: "kube-api-access-phgzp") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "kube-api-access-phgzp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:31.911563 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:31.911548 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1c1d77f5-df10-4280-99eb-47d03799e2f7" (UID: "1c1d77f5-df10-4280-99eb-47d03799e2f7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:55:32.009762 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009742 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-trusted-ca\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.009762 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009762 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phgzp\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-kube-api-access-phgzp\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.010048 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009772 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-bound-sa-token\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.010048 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009780 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1c1d77f5-df10-4280-99eb-47d03799e2f7-ca-trust-extracted\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.010048 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009789 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-installation-pull-secrets\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.010048 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009798 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-certificates\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.010048 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.009807 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1c1d77f5-df10-4280-99eb-47d03799e2f7-image-registry-private-configuration\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:32.415603 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.415570 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" podUID="9d8ac45a-7bb3-4a8a-899a-dde2387958a7" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:55:32.803123 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.803095 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d5bf687d7-xq82d"
Apr 16 14:55:32.833815 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.833789 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d5bf687d7-xq82d"]
Apr 16 14:55:32.836467 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.836444 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d5bf687d7-xq82d"]
Apr 16 14:55:32.917041 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:32.917018 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1c1d77f5-df10-4280-99eb-47d03799e2f7-registry-tls\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 14:55:34.203948 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:34.203918 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1d77f5-df10-4280-99eb-47d03799e2f7" path="/var/lib/kubelet/pods/1c1d77f5-df10-4280-99eb-47d03799e2f7/volumes"
Apr 16 14:55:42.415978 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:42.415937 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" podUID="9d8ac45a-7bb3-4a8a-899a-dde2387958a7" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:55:52.415634 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.415593 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" podUID="9d8ac45a-7bb3-4a8a-899a-dde2387958a7" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 16 14:55:52.416000 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.415668 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h"
Apr 16 14:55:52.416145 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.416105 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"91f3d9bb17d0c0f7186ec746ec20052396dcb4cd6d73c8eccb036029438885bb"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 16 14:55:52.416189 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.416175 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" podUID="9d8ac45a-7bb3-4a8a-899a-dde2387958a7" containerName="service-proxy" containerID="cri-o://91f3d9bb17d0c0f7186ec746ec20052396dcb4cd6d73c8eccb036029438885bb" gracePeriod=30
Apr 16 14:55:52.851928 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.851895 2575 generic.go:358] "Generic (PLEG): container finished" podID="9d8ac45a-7bb3-4a8a-899a-dde2387958a7" containerID="91f3d9bb17d0c0f7186ec746ec20052396dcb4cd6d73c8eccb036029438885bb" exitCode=2
Apr 16 14:55:52.852139 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.851959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerDied","Data":"91f3d9bb17d0c0f7186ec746ec20052396dcb4cd6d73c8eccb036029438885bb"}
Apr 16 14:55:52.852139 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:55:52.851985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5466b455c-wlw8h" event={"ID":"9d8ac45a-7bb3-4a8a-899a-dde2387958a7","Type":"ContainerStarted","Data":"421c0e79ecf32dee6b40f07c063e037245807dc2b3d4ad5fc3d982d03b459f40"}
Apr 16 14:56:07.532458 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:07.532430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vrpg_55ad67c5-7bb1-4c6c-8c58-869beff80d7f/dns-node-resolver/0.log"
Apr 16 14:56:22.045885 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.045844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:56:22.048168 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.048143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/012eecca-9f9b-4a13-8adc-05b585fd794b-metrics-certs\") pod \"network-metrics-daemon-qf2x7\" (UID: \"012eecca-9f9b-4a13-8adc-05b585fd794b\") " pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:56:22.305191 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.305131 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fb76v\""
Apr 16 14:56:22.312406 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.312390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qf2x7"
Apr 16 14:56:22.419873 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.419846 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qf2x7"]
Apr 16 14:56:22.423821 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:56:22.423795 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod012eecca_9f9b_4a13_8adc_05b585fd794b.slice/crio-5b8e03d243ff0317d32ce306cf4a6b29abbd7df84f6061462f9aea5ed4c67d6f WatchSource:0}: Error finding container 5b8e03d243ff0317d32ce306cf4a6b29abbd7df84f6061462f9aea5ed4c67d6f: Status 404 returned error can't find the container with id 5b8e03d243ff0317d32ce306cf4a6b29abbd7df84f6061462f9aea5ed4c67d6f
Apr 16 14:56:22.923649 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:22.923603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qf2x7" event={"ID":"012eecca-9f9b-4a13-8adc-05b585fd794b","Type":"ContainerStarted","Data":"5b8e03d243ff0317d32ce306cf4a6b29abbd7df84f6061462f9aea5ed4c67d6f"}
Apr 16 14:56:23.929674 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:23.929638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qf2x7" event={"ID":"012eecca-9f9b-4a13-8adc-05b585fd794b","Type":"ContainerStarted","Data":"c2c92b7a34c93227eb39e253eceb92ff6b02cc1288749b73de16a25fa4f785e1"}
Apr 16 14:56:23.929674 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:23.929673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qf2x7" event={"ID":"012eecca-9f9b-4a13-8adc-05b585fd794b","Type":"ContainerStarted","Data":"c56e47248bfced907cf953e2b4118e6bd341071a54b40911262a2bb75b1767ec"}
Apr 16 14:56:48.686625 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:56:48.686579 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cvpwl" podUID="5803f957-e9d1-4ccf-a732-54889272611a"
Apr 16 14:56:48.686625 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:56:48.686617 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" podUID="8b8f0df0-4d3f-4fdd-894b-fd928f0d7481"
Apr 16 14:56:48.687006 ip-10-0-130-229 kubenswrapper[2575]: E0416 14:56:48.686617 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m6qlb" podUID="7badd0a7-a664-4046-8cb3-c1bf570dc29b"
Apr 16 14:56:48.989786 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:48.989715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:56:48.989786 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:48.989752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cvpwl"
Apr 16 14:56:48.989786 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:48.989779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"
Apr 16 14:56:52.044899 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.044870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:56:52.045344 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.044909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl"
Apr 16 14:56:52.045344 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.044949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"
Apr 16 14:56:52.047189 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.047167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7badd0a7-a664-4046-8cb3-c1bf570dc29b-metrics-tls\") pod \"dns-default-m6qlb\" (UID: \"7badd0a7-a664-4046-8cb3-c1bf570dc29b\") " pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:56:52.047357 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.047341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5803f957-e9d1-4ccf-a732-54889272611a-cert\") pod \"ingress-canary-cvpwl\" (UID: \"5803f957-e9d1-4ccf-a732-54889272611a\") " pod="openshift-ingress-canary/ingress-canary-cvpwl"
Apr 16 14:56:52.047443 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.047423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8b8f0df0-4d3f-4fdd-894b-fd928f0d7481-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-5wrz2\" (UID: \"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"
Apr 16 14:56:52.294255 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.294228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-m2hzf\""
Apr 16 14:56:52.294255 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.294246 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9hwfz\""
Apr 16 14:56:52.294430 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.294232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cvt8w\""
Apr 16 14:56:52.300636 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.300579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"
Apr 16 14:56:52.300636 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.300609 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:56:52.300763 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.300588 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cvpwl"
Apr 16 14:56:52.428051 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.428002 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qf2x7" podStartSLOduration=281.554396472 podStartE2EDuration="4m42.427963013s" podCreationTimestamp="2026-04-16 14:52:10 +0000 UTC" firstStartedPulling="2026-04-16 14:56:22.426036482 +0000 UTC m=+252.795318933" lastFinishedPulling="2026-04-16 14:56:23.299603023 +0000 UTC m=+253.668885474" observedRunningTime="2026-04-16 14:56:23.946143036 +0000 UTC m=+254.315425510" watchObservedRunningTime="2026-04-16 14:56:52.427963013 +0000 UTC m=+282.797245489"
Apr 16 14:56:52.429441 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.429419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2"]
Apr 16 14:56:52.434617 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:56:52.434593 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b8f0df0_4d3f_4fdd_894b_fd928f0d7481.slice/crio-bbefdd1ef712e1495eb7916bb15b686a701f4bf18357bdbee4800ab97ef2993f WatchSource:0}: Error finding container bbefdd1ef712e1495eb7916bb15b686a701f4bf18357bdbee4800ab97ef2993f: Status 404 returned error can't find the container with id bbefdd1ef712e1495eb7916bb15b686a701f4bf18357bdbee4800ab97ef2993f
Apr 16 14:56:52.442345 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.442325 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m6qlb"]
Apr 16 14:56:52.444393 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:56:52.444372 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7badd0a7_a664_4046_8cb3_c1bf570dc29b.slice/crio-8486e7f98b513f9a2021f55aa9626e08711f7e27b0db8e5a58638e6339bacf74 WatchSource:0}: Error finding container 8486e7f98b513f9a2021f55aa9626e08711f7e27b0db8e5a58638e6339bacf74: Status 404 returned error can't find the container with id 8486e7f98b513f9a2021f55aa9626e08711f7e27b0db8e5a58638e6339bacf74
Apr 16 14:56:52.460315 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:52.460288 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cvpwl"]
Apr 16 14:56:52.462478 ip-10-0-130-229 kubenswrapper[2575]: W0416 14:56:52.462458 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5803f957_e9d1_4ccf_a732_54889272611a.slice/crio-3a2ef79cce5b1c3642b3f4bd0c7c9494d023ed1cc739706adefcd36528b7b0cb WatchSource:0}: Error finding container 3a2ef79cce5b1c3642b3f4bd0c7c9494d023ed1cc739706adefcd36528b7b0cb: Status 404 returned error can't find the container with id 3a2ef79cce5b1c3642b3f4bd0c7c9494d023ed1cc739706adefcd36528b7b0cb
Apr 16 14:56:53.000621 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:53.000560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cvpwl" event={"ID":"5803f957-e9d1-4ccf-a732-54889272611a","Type":"ContainerStarted","Data":"3a2ef79cce5b1c3642b3f4bd0c7c9494d023ed1cc739706adefcd36528b7b0cb"}
Apr 16 14:56:53.002089 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:53.002014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" event={"ID":"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481","Type":"ContainerStarted","Data":"bbefdd1ef712e1495eb7916bb15b686a701f4bf18357bdbee4800ab97ef2993f"}
Apr 16 14:56:53.003821 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:53.003763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m6qlb" event={"ID":"7badd0a7-a664-4046-8cb3-c1bf570dc29b","Type":"ContainerStarted","Data":"8486e7f98b513f9a2021f55aa9626e08711f7e27b0db8e5a58638e6339bacf74"}
Apr 16 14:56:55.010716 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.010684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cvpwl" event={"ID":"5803f957-e9d1-4ccf-a732-54889272611a","Type":"ContainerStarted","Data":"d5c7e731d162e0cab6d981379fb50ba3984cd6f0b25a681caf37df3da4d0e4dd"}
Apr 16 14:56:55.012133 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.012104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" event={"ID":"8b8f0df0-4d3f-4fdd-894b-fd928f0d7481","Type":"ContainerStarted","Data":"cd02cf76e34d5ce5e551018c3793ecb0c17c31bceb82dd6e021703f9c66a9c04"}
Apr 16 14:56:55.013666 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.013640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m6qlb" event={"ID":"7badd0a7-a664-4046-8cb3-c1bf570dc29b","Type":"ContainerStarted","Data":"dd67068c06a093bbc2b89e1e370767a3d52d060301f9c91960bf2e2d2ad0ee2b"}
Apr 16 14:56:55.013753 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.013674 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m6qlb" event={"ID":"7badd0a7-a664-4046-8cb3-c1bf570dc29b","Type":"ContainerStarted","Data":"f76f292186e4e2097af2df5c37e176c92847341caaef96e1a5f155ae7133cb6b"}
Apr 16 14:56:55.013802 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.013773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:56:55.027174 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.027127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cvpwl" podStartSLOduration=251.127284143 podStartE2EDuration="4m13.027116364s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:56:52.463980663 +0000 UTC m=+282.833263117" lastFinishedPulling="2026-04-16 14:56:54.363812877 +0000 UTC m=+284.733095338" observedRunningTime="2026-04-16 14:56:55.025958966 +0000 UTC m=+285.395241437" watchObservedRunningTime="2026-04-16 14:56:55.027116364 +0000 UTC m=+285.396398835"
Apr 16 14:56:55.041778 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.041724 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m6qlb" podStartSLOduration=251.128616203 podStartE2EDuration="4m13.041707515s" podCreationTimestamp="2026-04-16 14:52:42 +0000 UTC" firstStartedPulling="2026-04-16 14:56:52.446158475 +0000 UTC m=+282.815440926" lastFinishedPulling="2026-04-16 14:56:54.359249774 +0000 UTC m=+284.728532238" observedRunningTime="2026-04-16 14:56:55.040493732 +0000 UTC m=+285.409776205" watchObservedRunningTime="2026-04-16 14:56:55.041707515 +0000 UTC m=+285.410989988"
Apr 16 14:56:55.058294 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:56:55.058256 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-5wrz2" podStartSLOduration=265.136218165 podStartE2EDuration="4m27.058243724s" podCreationTimestamp="2026-04-16 14:52:28 +0000 UTC" firstStartedPulling="2026-04-16 14:56:52.436941225 +0000 UTC m=+282.806223679" lastFinishedPulling="2026-04-16 14:56:54.358966787 +0000 UTC m=+284.728249238" observedRunningTime="2026-04-16 14:56:55.05755549 +0000 UTC m=+285.426837961" watchObservedRunningTime="2026-04-16 14:56:55.058243724 +0000 UTC m=+285.427526195"
Apr 16 14:57:05.017997 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:57:05.017968 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m6qlb"
Apr 16 14:57:10.092103 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:57:10.092076 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 14:57:10.092587 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:57:10.092144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 14:57:10.098424 ip-10-0-130-229 kubenswrapper[2575]: I0416 14:57:10.098316 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 15:02:10.107678 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:02:10.107651 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:02:10.109001 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:02:10.108982 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:03:31.398824 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.398791 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"]
Apr 16 15:03:31.401902 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.401888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:31.404637 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.404619 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 15:03:31.404637 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.404627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-ctvmq\""
Apr 16 15:03:31.406002 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.405987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 15:03:31.406058 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.405987 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 15:03:31.410920 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.410901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"]
Apr 16 15:03:31.546160 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.546134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:31.546291 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.546173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:31.647061 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.647038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:31.647175 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.647096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:31.647242 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:03:31.647183 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 16 15:03:31.647293 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:03:31.647275 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert podName:ab76072a-d1e3-4b63-a234-fed084c535b0 nodeName:}" failed. No retries permitted until 2026-04-16 15:03:32.147255108 +0000 UTC m=+682.516537563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert") pod "kserve-controller-manager-7669bdc57-pvs8k" (UID: "ab76072a-d1e3-4b63-a234-fed084c535b0") : secret "kserve-webhook-server-cert" not found
Apr 16 15:03:31.657168 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:31.657110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:32.149987 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:32.149960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:32.152161 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:32.152144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") pod \"kserve-controller-manager-7669bdc57-pvs8k\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:32.312203 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:32.312178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:32.423235 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:32.423154 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"]
Apr 16 15:03:32.426308 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:03:32.426272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab76072a_d1e3_4b63_a234_fed084c535b0.slice/crio-54f4d33d0321ad65728adaa723e9cdb418d0c7a203f7e5cadfc06c29e3a63f6e WatchSource:0}: Error finding container 54f4d33d0321ad65728adaa723e9cdb418d0c7a203f7e5cadfc06c29e3a63f6e: Status 404 returned error can't find the container with id 54f4d33d0321ad65728adaa723e9cdb418d0c7a203f7e5cadfc06c29e3a63f6e
Apr 16 15:03:32.427970 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:32.427952 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:03:33.002793 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:33.002755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" event={"ID":"ab76072a-d1e3-4b63-a234-fed084c535b0","Type":"ContainerStarted","Data":"54f4d33d0321ad65728adaa723e9cdb418d0c7a203f7e5cadfc06c29e3a63f6e"}
Apr 16 15:03:36.012083 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:36.012049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" event={"ID":"ab76072a-d1e3-4b63-a234-fed084c535b0","Type":"ContainerStarted","Data":"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd"}
Apr 16 15:03:36.012444 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:36.012179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:03:36.029418 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:03:36.029377 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" podStartSLOduration=2.00611795 podStartE2EDuration="5.029364293s" podCreationTimestamp="2026-04-16 15:03:31 +0000 UTC" firstStartedPulling="2026-04-16 15:03:32.428070764 +0000 UTC m=+682.797353215" lastFinishedPulling="2026-04-16 15:03:35.451317105 +0000 UTC m=+685.820599558" observedRunningTime="2026-04-16 15:03:36.02740251 +0000 UTC m=+686.396684981" watchObservedRunningTime="2026-04-16 15:03:36.029364293 +0000 UTC m=+686.398646764"
Apr 16 15:04:07.019757 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:07.019680 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k"
Apr 16 15:04:08.116134 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.116102 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"]
Apr 16 15:04:08.116526 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.116300 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" podUID="ab76072a-d1e3-4b63-a234-fed084c535b0" containerName="manager" containerID="cri-o://c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd" gracePeriod=10
Apr 16 15:04:08.138516 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.138495 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-cs48h"]
Apr 16 15:04:08.140286 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.140271 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.150698 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.150675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-cs48h"] Apr 16 15:04:08.169413 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.169390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e25beb85-308f-4736-abc2-ce6cbe14a1cc-cert\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.169501 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.169420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4phs\" (UniqueName: \"kubernetes.io/projected/e25beb85-308f-4736-abc2-ce6cbe14a1cc-kube-api-access-c4phs\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.270722 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.270696 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4phs\" (UniqueName: \"kubernetes.io/projected/e25beb85-308f-4736-abc2-ce6cbe14a1cc-kube-api-access-c4phs\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.270824 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.270768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e25beb85-308f-4736-abc2-ce6cbe14a1cc-cert\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.272897 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.272860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e25beb85-308f-4736-abc2-ce6cbe14a1cc-cert\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.278939 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.278894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4phs\" (UniqueName: \"kubernetes.io/projected/e25beb85-308f-4736-abc2-ce6cbe14a1cc-kube-api-access-c4phs\") pod \"kserve-controller-manager-7669bdc57-cs48h\" (UID: \"e25beb85-308f-4736-abc2-ce6cbe14a1cc\") " pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.350256 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.350237 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" Apr 16 15:04:08.371640 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.371315 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") pod \"ab76072a-d1e3-4b63-a234-fed084c535b0\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " Apr 16 15:04:08.371640 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.371367 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb\") pod \"ab76072a-d1e3-4b63-a234-fed084c535b0\" (UID: \"ab76072a-d1e3-4b63-a234-fed084c535b0\") " Apr 16 15:04:08.373816 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.373790 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert" (OuterVolumeSpecName: "cert") pod "ab76072a-d1e3-4b63-a234-fed084c535b0" (UID: "ab76072a-d1e3-4b63-a234-fed084c535b0"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:04:08.373897 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.373807 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb" (OuterVolumeSpecName: "kube-api-access-kf2bb") pod "ab76072a-d1e3-4b63-a234-fed084c535b0" (UID: "ab76072a-d1e3-4b63-a234-fed084c535b0"). InnerVolumeSpecName "kube-api-access-kf2bb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:04:08.472709 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.472684 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab76072a-d1e3-4b63-a234-fed084c535b0-cert\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\"" Apr 16 15:04:08.472709 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.472705 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kf2bb\" (UniqueName: \"kubernetes.io/projected/ab76072a-d1e3-4b63-a234-fed084c535b0-kube-api-access-kf2bb\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\"" Apr 16 15:04:08.481724 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.481707 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:08.588247 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:08.588201 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-cs48h"] Apr 16 15:04:08.590769 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:04:08.590745 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode25beb85_308f_4736_abc2_ce6cbe14a1cc.slice/crio-0e95f33c5d11a0ea7dc0cbb00fe33e4a2aeac8b544829f24558b094cfa28597d WatchSource:0}: Error finding container 0e95f33c5d11a0ea7dc0cbb00fe33e4a2aeac8b544829f24558b094cfa28597d: Status 404 returned error can't find the container with id 0e95f33c5d11a0ea7dc0cbb00fe33e4a2aeac8b544829f24558b094cfa28597d Apr 16 15:04:09.090871 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.090837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" event={"ID":"e25beb85-308f-4736-abc2-ce6cbe14a1cc","Type":"ContainerStarted","Data":"6a8fc52e7d32ff27241929f4b4992c670c2e2d9e742e992e0911e40340214cc7"} Apr 16 15:04:09.091060 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.090880 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:09.091060 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.090893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" event={"ID":"e25beb85-308f-4736-abc2-ce6cbe14a1cc","Type":"ContainerStarted","Data":"0e95f33c5d11a0ea7dc0cbb00fe33e4a2aeac8b544829f24558b094cfa28597d"} Apr 16 15:04:09.091842 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.091823 2575 generic.go:358] "Generic (PLEG): container finished" podID="ab76072a-d1e3-4b63-a234-fed084c535b0" containerID="c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd" exitCode=0 Apr 16 15:04:09.091921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.091854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" event={"ID":"ab76072a-d1e3-4b63-a234-fed084c535b0","Type":"ContainerDied","Data":"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd"} Apr 16 15:04:09.091921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.091877 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" Apr 16 15:04:09.091921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.091891 2575 scope.go:117] "RemoveContainer" containerID="c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd" Apr 16 15:04:09.092032 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.091881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7669bdc57-pvs8k" event={"ID":"ab76072a-d1e3-4b63-a234-fed084c535b0","Type":"ContainerDied","Data":"54f4d33d0321ad65728adaa723e9cdb418d0c7a203f7e5cadfc06c29e3a63f6e"} Apr 16 15:04:09.099990 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.099968 2575 scope.go:117] "RemoveContainer" containerID="c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd" Apr 16 15:04:09.100258 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:04:09.100233 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd\": container with ID starting with c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd not found: ID does not exist" containerID="c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd" Apr 16 15:04:09.100319 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.100267 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd"} err="failed to get container status \"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd\": rpc error: code = NotFound desc = could not find container \"c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd\": container with ID starting with c3a34f84573e70f00b3a54f2686b3d8bbbf3df142de554d634913b4b0be8eedd not found: ID does not exist" Apr 16 15:04:09.106341 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.106305 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" podStartSLOduration=0.834979055 podStartE2EDuration="1.106293788s" podCreationTimestamp="2026-04-16 15:04:08 +0000 UTC" firstStartedPulling="2026-04-16 15:04:08.591984822 +0000 UTC m=+718.961267274" lastFinishedPulling="2026-04-16 15:04:08.863299553 +0000 UTC m=+719.232582007" observedRunningTime="2026-04-16 15:04:09.105140083 +0000 UTC m=+719.474422555" watchObservedRunningTime="2026-04-16 15:04:09.106293788 +0000 UTC m=+719.475576304" Apr 16 15:04:09.118686 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.118641 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"] Apr 16 15:04:09.120833 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:09.120813 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7669bdc57-pvs8k"] Apr 16 15:04:10.204587 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:10.204557 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab76072a-d1e3-4b63-a234-fed084c535b0" path="/var/lib/kubelet/pods/ab76072a-d1e3-4b63-a234-fed084c535b0/volumes" Apr 16 15:04:40.099372 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:40.099342 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7669bdc57-cs48h" Apr 16 15:04:41.032728 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.032698 2575 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve/model-serving-api-86f7b4b499-vgvrp"] Apr 16 15:04:41.032965 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.032915 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab76072a-d1e3-4b63-a234-fed084c535b0" containerName="manager" Apr 16 15:04:41.032965 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.032926 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab76072a-d1e3-4b63-a234-fed084c535b0" containerName="manager" Apr 16 15:04:41.033039 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.032971 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab76072a-d1e3-4b63-a234-fed084c535b0" containerName="manager" Apr 16 15:04:41.035695 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.035674 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.038345 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.038316 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 15:04:41.038442 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.038387 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9565c\"" Apr 16 15:04:41.043258 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.043236 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-cjjf2"] Apr 16 15:04:41.046085 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.046067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.046586 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.046568 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vgvrp"] Apr 16 15:04:41.048476 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.048455 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 15:04:41.048564 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.048503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-m6bzr\"" Apr 16 15:04:41.059955 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.059931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-cjjf2"] Apr 16 15:04:41.077267 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.077250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tm48\" (UniqueName: \"kubernetes.io/projected/6bb090bd-89f7-4a8b-a80d-d41294f7df52-kube-api-access-6tm48\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.077356 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.077282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.077356 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.077301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.077432 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.077387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fgl\" (UniqueName: \"kubernetes.io/projected/8bcff461-ab8a-4d95-8e77-101e852f5b21-kube-api-access-s7fgl\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.178179 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.178155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tm48\" (UniqueName: \"kubernetes.io/projected/6bb090bd-89f7-4a8b-a80d-d41294f7df52-kube-api-access-6tm48\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.178192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.178226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.178266 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7fgl\" (UniqueName: \"kubernetes.io/projected/8bcff461-ab8a-4d95-8e77-101e852f5b21-kube-api-access-s7fgl\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:04:41.178318 2575 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:04:41.178377 2575 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:04:41.178379 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs podName:6bb090bd-89f7-4a8b-a80d-d41294f7df52 nodeName:}" failed. No retries permitted until 2026-04-16 15:04:41.678361192 +0000 UTC m=+752.047643657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs") pod "model-serving-api-86f7b4b499-vgvrp" (UID: "6bb090bd-89f7-4a8b-a80d-d41294f7df52") : secret "model-serving-api-tls" not found Apr 16 15:04:41.178462 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:04:41.178435 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert podName:8bcff461-ab8a-4d95-8e77-101e852f5b21 nodeName:}" failed. No retries permitted until 2026-04-16 15:04:41.678420117 +0000 UTC m=+752.047702570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert") pod "odh-model-controller-696fc77849-cjjf2" (UID: "8bcff461-ab8a-4d95-8e77-101e852f5b21") : secret "odh-model-controller-webhook-cert" not found Apr 16 15:04:41.191795 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.191772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tm48\" (UniqueName: \"kubernetes.io/projected/6bb090bd-89f7-4a8b-a80d-d41294f7df52-kube-api-access-6tm48\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.191889 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.191796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fgl\" (UniqueName: \"kubernetes.io/projected/8bcff461-ab8a-4d95-8e77-101e852f5b21-kube-api-access-s7fgl\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.682461 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.682425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.682461 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.682465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.684692 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.684666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb090bd-89f7-4a8b-a80d-d41294f7df52-tls-certs\") pod \"model-serving-api-86f7b4b499-vgvrp\" (UID: \"6bb090bd-89f7-4a8b-a80d-d41294f7df52\") " pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.684791 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.684737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8bcff461-ab8a-4d95-8e77-101e852f5b21-cert\") pod \"odh-model-controller-696fc77849-cjjf2\" (UID: \"8bcff461-ab8a-4d95-8e77-101e852f5b21\") " pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:41.945855 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.945798 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:41.954883 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:41.954859 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:42.064161 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:42.064137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vgvrp"] Apr 16 15:04:42.066200 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:04:42.066166 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb090bd_89f7_4a8b_a80d_d41294f7df52.slice/crio-8380b2ad39d0e532779f51470bf21352aec79364e4be04771605737c23852fad WatchSource:0}: Error finding container 8380b2ad39d0e532779f51470bf21352aec79364e4be04771605737c23852fad: Status 404 returned error can't find the container with id 8380b2ad39d0e532779f51470bf21352aec79364e4be04771605737c23852fad Apr 16 15:04:42.083823 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:42.083798 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-cjjf2"] Apr 16 15:04:42.086776 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:04:42.086752 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bcff461_ab8a_4d95_8e77_101e852f5b21.slice/crio-349a3c0a783e4fb2493f94135ec3480da1d2d30d1e345885ef0b38301817dc2e WatchSource:0}: Error finding container 349a3c0a783e4fb2493f94135ec3480da1d2d30d1e345885ef0b38301817dc2e: Status 404 returned error can't find the container with id 349a3c0a783e4fb2493f94135ec3480da1d2d30d1e345885ef0b38301817dc2e Apr 16 15:04:42.171974 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:42.171945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-cjjf2" event={"ID":"8bcff461-ab8a-4d95-8e77-101e852f5b21","Type":"ContainerStarted","Data":"349a3c0a783e4fb2493f94135ec3480da1d2d30d1e345885ef0b38301817dc2e"} Apr 16 15:04:42.172814 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:42.172793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vgvrp" event={"ID":"6bb090bd-89f7-4a8b-a80d-d41294f7df52","Type":"ContainerStarted","Data":"8380b2ad39d0e532779f51470bf21352aec79364e4be04771605737c23852fad"} Apr 16 15:04:44.180552 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:44.180517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vgvrp" event={"ID":"6bb090bd-89f7-4a8b-a80d-d41294f7df52","Type":"ContainerStarted","Data":"7d277a4dde2cef99eda05f86ab72c2c4ebeb2d35671c6da70691dccc4a8b1dbb"} Apr 16 15:04:44.180909 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:44.180633 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:44.198296 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:44.198235 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-vgvrp" podStartSLOduration=1.830531868 podStartE2EDuration="3.198201491s" podCreationTimestamp="2026-04-16 15:04:41 +0000 UTC" firstStartedPulling="2026-04-16 15:04:42.067998552 +0000 UTC m=+752.437281003" lastFinishedPulling="2026-04-16 15:04:43.435668163 +0000 UTC m=+753.804950626" observedRunningTime="2026-04-16 15:04:44.19634259 +0000 UTC m=+754.565625061" 
watchObservedRunningTime="2026-04-16 15:04:44.198201491 +0000 UTC m=+754.567483963" Apr 16 15:04:45.184608 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:45.184571 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-cjjf2" event={"ID":"8bcff461-ab8a-4d95-8e77-101e852f5b21","Type":"ContainerStarted","Data":"eac7d650fa8d3a84086d74481a98234bed11c11025628efec3b9c0d0b01bec21"} Apr 16 15:04:45.184990 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:45.184696 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:04:45.201098 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:45.201059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-cjjf2" podStartSLOduration=1.549616446 podStartE2EDuration="4.20104776s" podCreationTimestamp="2026-04-16 15:04:41 +0000 UTC" firstStartedPulling="2026-04-16 15:04:42.087656614 +0000 UTC m=+752.456939065" lastFinishedPulling="2026-04-16 15:04:44.739087922 +0000 UTC m=+755.108370379" observedRunningTime="2026-04-16 15:04:45.199352275 +0000 UTC m=+755.568634758" watchObservedRunningTime="2026-04-16 15:04:45.20104776 +0000 UTC m=+755.570330231" Apr 16 15:04:55.189378 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:55.189351 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-vgvrp" Apr 16 15:04:56.189712 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:04:56.189682 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-cjjf2" Apr 16 15:05:16.950004 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:16.949973 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"] Apr 16 15:05:16.954369 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:16.954350 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" Apr 16 15:05:16.957049 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:16.957032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5c2zz\"" Apr 16 15:05:16.960151 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:16.960125 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"] Apr 16 15:05:16.965045 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:16.965031 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" Apr 16 15:05:17.079175 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.079148 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"] Apr 16 15:05:17.081831 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:05:17.081806 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod735a135c_bde4_4088_bc69_245eb1b4ac4d.slice/crio-5e27afc3dc89232e3d7ce3949a342978b97661b13c6cd9a6066dd5471576ae70 WatchSource:0}: Error finding container 5e27afc3dc89232e3d7ce3949a342978b97661b13c6cd9a6066dd5471576ae70: Status 404 returned error can't find the container with id 5e27afc3dc89232e3d7ce3949a342978b97661b13c6cd9a6066dd5471576ae70 Apr 16 15:05:17.272984 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.272903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" event={"ID":"735a135c-bde4-4088-bc69-245eb1b4ac4d","Type":"ContainerStarted","Data":"5e27afc3dc89232e3d7ce3949a342978b97661b13c6cd9a6066dd5471576ae70"} Apr 16 15:05:17.360281 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.360240 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"] Apr 16 15:05:17.362881 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.362862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:17.372461 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.372439 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"] Apr 16 15:05:17.428276 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.428252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-5hnph\" (UID: \"db6ba3de-22ea-4778-9d79-832c269e7aee\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:17.518472 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.518447 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"] Apr 16 15:05:17.520729 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.520709 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:17.529603 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.529344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-5hnph\" (UID: \"db6ba3de-22ea-4778-9d79-832c269e7aee\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:17.530949 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.530332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"] Apr 16 15:05:17.530949 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.530803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-5hnph\" (UID: \"db6ba3de-22ea-4778-9d79-832c269e7aee\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:17.630248 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.630189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-654489c865-9qjkc\" (UID: \"a4006d54-acd0-4f3f-a36b-cb21e72133ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:17.673986 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.673957 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:17.731991 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.731618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-654489c865-9qjkc\" (UID: \"a4006d54-acd0-4f3f-a36b-cb21e72133ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:17.732847 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.732045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-654489c865-9qjkc\" (UID: \"a4006d54-acd0-4f3f-a36b-cb21e72133ff\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:17.834165 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.833725 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:17.847554 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:17.847470 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"] Apr 16 15:05:17.852270 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:05:17.852235 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb6ba3de_22ea_4778_9d79_832c269e7aee.slice/crio-877ad3e15dae657b897a874a8d7d0025cabd6736356b3aaab9915c5b8a973927 WatchSource:0}: Error finding container 877ad3e15dae657b897a874a8d7d0025cabd6736356b3aaab9915c5b8a973927: Status 404 returned error can't find the container with id 877ad3e15dae657b897a874a8d7d0025cabd6736356b3aaab9915c5b8a973927 Apr 16 15:05:18.011255 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:18.011203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"] Apr 16 15:05:18.278137 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:18.278089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerStarted","Data":"877ad3e15dae657b897a874a8d7d0025cabd6736356b3aaab9915c5b8a973927"} Apr 16 15:05:18.279594 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:18.279551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerStarted","Data":"0e90081fa3672f360c05b69197b10a6a4f5669f40601f7ab7354b967959eebe4"} Apr 16 15:05:30.322345 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:30.322248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerStarted","Data":"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"} Apr 16 15:05:30.323547 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:30.323518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerStarted","Data":"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"} Apr 16 15:05:30.324908 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:30.324885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" event={"ID":"735a135c-bde4-4088-bc69-245eb1b4ac4d","Type":"ContainerStarted","Data":"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"} Apr 16 15:05:30.325108 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:30.325095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" Apr 16 15:05:30.326308 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:30.326282 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:05:30.367155 ip-10-0-130-229 
kubenswrapper[2575]: I0416 15:05:30.367097 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podStartSLOduration=1.4229081190000001 podStartE2EDuration="14.367078823s" podCreationTimestamp="2026-04-16 15:05:16 +0000 UTC" firstStartedPulling="2026-04-16 15:05:17.083529233 +0000 UTC m=+787.452811683" lastFinishedPulling="2026-04-16 15:05:30.027699923 +0000 UTC m=+800.396982387" observedRunningTime="2026-04-16 15:05:30.366043719 +0000 UTC m=+800.735326193" watchObservedRunningTime="2026-04-16 15:05:30.367078823 +0000 UTC m=+800.736361297" Apr 16 15:05:31.327761 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:31.327720 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:05:34.336234 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:34.336139 2575 generic.go:358] "Generic (PLEG): container finished" podID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerID="6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43" exitCode=0 Apr 16 15:05:34.336627 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:34.336247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerDied","Data":"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"} Apr 16 15:05:34.337810 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:34.337765 2575 generic.go:358] "Generic (PLEG): container finished" podID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerID="d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d" exitCode=0 Apr 16 15:05:34.337892 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:34.337830 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerDied","Data":"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"} Apr 16 15:05:41.328647 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:41.328591 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:05:42.368187 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:42.368144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerStarted","Data":"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"} Apr 16 15:05:42.368598 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:42.368445 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" Apr 16 15:05:42.370115 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:42.370068 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:05:42.384020 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:42.383969 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podStartSLOduration=1.366346363 podStartE2EDuration="25.38395325s" podCreationTimestamp="2026-04-16 15:05:17 +0000 UTC" firstStartedPulling="2026-04-16 15:05:18.017657942 +0000 UTC m=+788.386940397" lastFinishedPulling="2026-04-16 15:05:42.035264821 +0000 UTC m=+812.404547284" observedRunningTime="2026-04-16 15:05:42.3829281 +0000 UTC m=+812.752210597" watchObservedRunningTime="2026-04-16 15:05:42.38395325 +0000 UTC m=+812.753235722" Apr 16 15:05:43.370710 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:43.370671 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:05:51.328859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:51.328816 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:05:53.371178 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:53.371134 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:05:53.402481 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:53.402452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerStarted","Data":"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"} Apr 16 15:05:53.402756 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:53.402737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" Apr 16 15:05:53.404075 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:53.404051 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:05:53.418363 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:53.418317 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podStartSLOduration=1.8219230560000002 podStartE2EDuration="36.418307128s" podCreationTimestamp="2026-04-16 15:05:17 +0000 UTC" firstStartedPulling="2026-04-16 15:05:17.85675248 +0000 UTC m=+788.226034943" lastFinishedPulling="2026-04-16 15:05:52.453136565 +0000 UTC m=+822.822419015" observedRunningTime="2026-04-16 15:05:53.417002541 +0000 UTC m=+823.786285011" watchObservedRunningTime="2026-04-16 15:05:53.418307128 +0000 UTC m=+823.787589599" Apr 16 15:05:54.405198 
ip-10-0-130-229 kubenswrapper[2575]: I0416 15:05:54.405155 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:06:01.328624 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:01.328581 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:06:03.371351 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:03.371309 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:06:04.405893 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:04.405850 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:06:11.328301 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:11.328255 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 16 15:06:13.370859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:13.370819 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:06:14.405269 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:14.405206 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 16 15:06:21.328964 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:21.328923 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" Apr 16 15:06:23.371284 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:23.371244 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused" Apr 16 15:06:24.405639 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:24.405598 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.19:8080: connect: connection refused"
Apr 16 15:06:33.371055 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:33.371017 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 16 15:06:34.405806 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:34.405768 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 16 15:06:43.371558 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:43.371519 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 16 15:06:44.405393 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:44.405352 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 16 15:06:47.306032 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.305230 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"]
Apr 16 15:06:47.306032 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.305631 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container" containerID="cri-o://9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02" gracePeriod=30
Apr 16 15:06:47.357362 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.357332 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:06:47.359204 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.359189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:06:47.367396 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.367372 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:06:47.368625 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.368600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:06:47.480064 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.480032 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:06:47.483296 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:06:47.483270 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1c7f898_5967_469f_be6b_a239528e1b81.slice/crio-5aeb9687d6458b88fdecc6dcaeccf1ac18ae105951f18857882f35af4a057474 WatchSource:0}: Error finding container 5aeb9687d6458b88fdecc6dcaeccf1ac18ae105951f18857882f35af4a057474: Status 404 returned error can't find the container with id 5aeb9687d6458b88fdecc6dcaeccf1ac18ae105951f18857882f35af4a057474
Apr 16 15:06:47.549735 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.549708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" event={"ID":"c1c7f898-5967-469f-be6b-a239528e1b81","Type":"ContainerStarted","Data":"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"}
Apr 16 15:06:47.549841 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.549746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" event={"ID":"c1c7f898-5967-469f-be6b-a239528e1b81","Type":"ContainerStarted","Data":"5aeb9687d6458b88fdecc6dcaeccf1ac18ae105951f18857882f35af4a057474"}
Apr 16 15:06:47.549887 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.549850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:06:47.550944 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.550921 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:06:47.563541 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:47.563480 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podStartSLOduration=0.563468205 podStartE2EDuration="563.468205ms" podCreationTimestamp="2026-04-16 15:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:06:47.562514793 +0000 UTC m=+877.931797266" watchObservedRunningTime="2026-04-16 15:06:47.563468205 +0000 UTC m=+877.932750676"
Apr 16 15:06:48.552400 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:48.552368 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:06:50.140064 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.140042 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"
Apr 16 15:06:50.557960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.557879 2575 generic.go:358] "Generic (PLEG): container finished" podID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerID="9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02" exitCode=0
Apr 16 15:06:50.557960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.557915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" event={"ID":"735a135c-bde4-4088-bc69-245eb1b4ac4d","Type":"ContainerDied","Data":"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"}
Apr 16 15:06:50.557960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.557938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp" event={"ID":"735a135c-bde4-4088-bc69-245eb1b4ac4d","Type":"ContainerDied","Data":"5e27afc3dc89232e3d7ce3949a342978b97661b13c6cd9a6066dd5471576ae70"}
Apr 16 15:06:50.557960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.557937 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"
Apr 16 15:06:50.557960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.557949 2575 scope.go:117] "RemoveContainer" containerID="9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"
Apr 16 15:06:50.565385 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.565365 2575 scope.go:117] "RemoveContainer" containerID="9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"
Apr 16 15:06:50.565620 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:06:50.565605 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02\": container with ID starting with 9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02 not found: ID does not exist" containerID="9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"
Apr 16 15:06:50.565664 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.565628 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02"} err="failed to get container status \"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02\": rpc error: code = NotFound desc = could not find container \"9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02\": container with ID starting with 9e9579290bb9179b19c56fcaaa8d815dc0b722c1fb86908b80505cafd37fdc02 not found: ID does not exist"
Apr 16 15:06:50.572644 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.572619 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"]
Apr 16 15:06:50.575154 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:50.575137 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1f99-predictor-79c6d7db55-qjzfp"]
Apr 16 15:06:52.204744 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:52.204712 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" path="/var/lib/kubelet/pods/735a135c-bde4-4088-bc69-245eb1b4ac4d/volumes"
Apr 16 15:06:53.372462 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:53.372427 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"
Apr 16 15:06:54.406096 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:54.406068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"
Apr 16 15:06:58.552627 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:06:58.552545 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:07:08.552682 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:08.552640 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:07:10.129880 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:10.129846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:07:10.131889 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:10.131859 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:07:18.552891 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:18.552849 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:07:27.257249 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.257204 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:07:27.257601 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.257475 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container"
Apr 16 15:07:27.257601 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.257486 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container"
Apr 16 15:07:27.257601 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.257541 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="735a135c-bde4-4088-bc69-245eb1b4ac4d" containerName="kserve-container"
Apr 16 15:07:27.260099 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.260081 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:07:27.269422 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.269401 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:07:27.269633 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.269613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:07:27.330531 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.330113 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"]
Apr 16 15:07:27.330531 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.330406 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container" containerID="cri-o://0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e" gracePeriod=30
Apr 16 15:07:27.411765 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.411738 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:07:27.414671 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:07:27.414632 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod285687a6_4d01_478f_bd88_043e9c2a35a8.slice/crio-a53f5ab0c11e976816bc4aea198f0d76935f1619e2477791717767ae981d36a5 WatchSource:0}: Error finding container a53f5ab0c11e976816bc4aea198f0d76935f1619e2477791717767ae981d36a5: Status 404 returned error can't find the container with id a53f5ab0c11e976816bc4aea198f0d76935f1619e2477791717767ae981d36a5
Apr 16 15:07:27.472524 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.472497 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"]
Apr 16 15:07:27.472771 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.472735 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container" containerID="cri-o://085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407" gracePeriod=30
Apr 16 15:07:27.655738 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.655704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" event={"ID":"285687a6-4d01-478f-bd88-043e9c2a35a8","Type":"ContainerStarted","Data":"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"}
Apr 16 15:07:27.655738 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.655738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" event={"ID":"285687a6-4d01-478f-bd88-043e9c2a35a8","Type":"ContainerStarted","Data":"a53f5ab0c11e976816bc4aea198f0d76935f1619e2477791717767ae981d36a5"}
Apr 16 15:07:27.655932 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.655873 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:07:27.657035 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.657007 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:07:27.670464 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:27.670428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podStartSLOduration=0.670416793 podStartE2EDuration="670.416793ms" podCreationTimestamp="2026-04-16 15:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:07:27.668842711 +0000 UTC m=+918.038125183" watchObservedRunningTime="2026-04-16 15:07:27.670416793 +0000 UTC m=+918.039699264"
Apr 16 15:07:28.552509 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:28.552469 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 16 15:07:28.658658 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:28.658621 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:07:31.174427 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.174399 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"
Apr 16 15:07:31.263605 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.263577 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location\") pod \"db6ba3de-22ea-4778-9d79-832c269e7aee\" (UID: \"db6ba3de-22ea-4778-9d79-832c269e7aee\") "
Apr 16 15:07:31.263865 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.263845 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "db6ba3de-22ea-4778-9d79-832c269e7aee" (UID: "db6ba3de-22ea-4778-9d79-832c269e7aee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:07:31.364935 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.364910 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/db6ba3de-22ea-4778-9d79-832c269e7aee-kserve-provision-location\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 15:07:31.668387 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.668356 2575 generic.go:358] "Generic (PLEG): container finished" podID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerID="0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e" exitCode=0
Apr 16 15:07:31.668496 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.668412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerDied","Data":"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"}
Apr 16 15:07:31.668496 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.668423 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"
Apr 16 15:07:31.668496 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.668449 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph" event={"ID":"db6ba3de-22ea-4778-9d79-832c269e7aee","Type":"ContainerDied","Data":"877ad3e15dae657b897a874a8d7d0025cabd6736356b3aaab9915c5b8a973927"}
Apr 16 15:07:31.668496 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.668470 2575 scope.go:117] "RemoveContainer" containerID="0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"
Apr 16 15:07:31.706779 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.706757 2575 scope.go:117] "RemoveContainer" containerID="6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"
Apr 16 15:07:31.713734 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.713716 2575 scope.go:117] "RemoveContainer" containerID="0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"
Apr 16 15:07:31.713944 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:07:31.713928 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e\": container with ID starting with 0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e not found: ID does not exist" containerID="0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"
Apr 16 15:07:31.713980 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.713955 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e"} err="failed to get container status \"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e\": rpc error: code = NotFound desc = could not find container \"0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e\": container with ID starting with 0eb20ad7d02c6f87f723d9a4bd9cc8173728489c39c230d3fcee0231f015ea4e not found: ID does not exist"
Apr 16 15:07:31.714019 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.713979 2575 scope.go:117] "RemoveContainer" containerID="6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"
Apr 16 15:07:31.714183 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:07:31.714156 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43\": container with ID starting with 6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43 not found: ID does not exist" containerID="6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"
Apr 16 15:07:31.714232 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.714196 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43"} err="failed to get container status \"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43\": rpc error: code = NotFound desc = could not find container \"6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43\": container with ID starting with 6a8cf3e6cb6d0ac4e1fef988909bfcfd29a1b1fd77b5aa42deb3f3e9b1efea43 not found: ID does not exist"
Apr 16 15:07:31.718173 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.718153 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"]
Apr 16 15:07:31.722260 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.722240 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-5hnph"]
Apr 16 15:07:31.799253 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.799234 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"
Apr 16 15:07:31.868397 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.868372 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location\") pod \"a4006d54-acd0-4f3f-a36b-cb21e72133ff\" (UID: \"a4006d54-acd0-4f3f-a36b-cb21e72133ff\") "
Apr 16 15:07:31.868658 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.868638 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a4006d54-acd0-4f3f-a36b-cb21e72133ff" (UID: "a4006d54-acd0-4f3f-a36b-cb21e72133ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:07:31.969727 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:31.969670 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a4006d54-acd0-4f3f-a36b-cb21e72133ff-kserve-provision-location\") on node \"ip-10-0-130-229.ec2.internal\" DevicePath \"\""
Apr 16 15:07:32.205075 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.205042 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" path="/var/lib/kubelet/pods/db6ba3de-22ea-4778-9d79-832c269e7aee/volumes"
Apr 16 15:07:32.676643 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.676611 2575 generic.go:358] "Generic (PLEG): container finished" podID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerID="085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407" exitCode=0
Apr 16 15:07:32.676786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.676691 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"
Apr 16 15:07:32.676786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.676694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerDied","Data":"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"}
Apr 16 15:07:32.676786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.676730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc" event={"ID":"a4006d54-acd0-4f3f-a36b-cb21e72133ff","Type":"ContainerDied","Data":"0e90081fa3672f360c05b69197b10a6a4f5669f40601f7ab7354b967959eebe4"}
Apr 16 15:07:32.676786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.676750 2575 scope.go:117] "RemoveContainer" containerID="085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"
Apr 16 15:07:32.683973 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.683958 2575 scope.go:117] "RemoveContainer" containerID="d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"
Apr 16 15:07:32.690350 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.690335 2575 scope.go:117] "RemoveContainer" containerID="085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"
Apr 16 15:07:32.690562 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:07:32.690545 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407\": container with ID starting with 085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407 not found: ID does not exist" containerID="085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"
Apr 16 15:07:32.690606 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.690570 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407"} err="failed to get container status \"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407\": rpc error: code = NotFound desc = could not find container \"085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407\": container with ID starting with 085e8d6bd737fccaf67869fbb926c127fafb8b7b6b8339457ddbd292f34aa407 not found: ID does not exist"
Apr 16 15:07:32.690606 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.690586 2575 scope.go:117] "RemoveContainer" containerID="d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"
Apr 16 15:07:32.690821 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:07:32.690802 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d\": container with ID starting with d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d not found: ID does not exist" containerID="d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"
Apr 16 15:07:32.690872 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.690831 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d"} err="failed to get container status \"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d\": rpc error: code = NotFound desc = could not find container \"d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d\": container with ID starting with d5686a45775fedac1036e4ec914da92026f1cd489678ae5d8a436453b509376d not found: ID does not exist"
Apr 16 15:07:32.693064 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.693044 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"]
Apr 16 15:07:32.696532 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:32.696514 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-654489c865-9qjkc"]
Apr 16 15:07:34.204824 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:34.204793 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" path="/var/lib/kubelet/pods/a4006d54-acd0-4f3f-a36b-cb21e72133ff/volumes"
Apr 16 15:07:38.554183 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:38.554152 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:07:38.658718 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:38.658685 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:07:48.658934 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:48.658896 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:07:58.658744 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:07:58.658698 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:08:08.658876 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:08:08.658834 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused"
Apr 16 15:08:18.660005 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:08:18.659974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:12:10.147231 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:12:10.147181 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:12:10.152425 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:12:10.152406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:16:12.234039 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.233956 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:16:12.235495 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.234308 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container" containerID="cri-o://923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f" gracePeriod=30
Apr 16 15:16:12.333279 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333252 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:16:12.333507 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333496 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="storage-initializer"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333509 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="storage-initializer"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333525 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333531 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333545 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="storage-initializer"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333550 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="storage-initializer"
Apr 16 15:16:12.333555 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333556 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container"
Apr 16 15:16:12.333722 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333561 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container"
Apr 16 15:16:12.333722 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333601 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4006d54-acd0-4f3f-a36b-cb21e72133ff" containerName="kserve-container"
Apr 16 15:16:12.333722 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.333611 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="db6ba3de-22ea-4778-9d79-832c269e7aee" containerName="kserve-container"
Apr 16 15:16:12.335251 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.335236 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:16:12.344461 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.344441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:16:12.348985 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.348962 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:16:12.466525 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.466492 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:16:12.469273 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:16:12.469247 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf48f0101_4def_4526_9ed7_56889b09bd6f.slice/crio-de01913949d9619e0757daf4dbd90aae5851c1b2341ae536d1d09cb046fe89ac WatchSource:0}: Error finding container de01913949d9619e0757daf4dbd90aae5851c1b2341ae536d1d09cb046fe89ac: Status 404 returned error can't find the container with id de01913949d9619e0757daf4dbd90aae5851c1b2341ae536d1d09cb046fe89ac
Apr 16 15:16:12.470876 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:12.470860 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:16:13.049247 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:13.049197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" event={"ID":"f48f0101-4def-4526-9ed7-56889b09bd6f","Type":"ContainerStarted","Data":"abe33dd5336fd3d545405dcc378cc36072529a262e35f2683224ccde1457caf9"}
Apr 16 15:16:13.049247 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:13.049243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" event={"ID":"f48f0101-4def-4526-9ed7-56889b09bd6f","Type":"ContainerStarted","Data":"de01913949d9619e0757daf4dbd90aae5851c1b2341ae536d1d09cb046fe89ac"}
Apr 16 15:16:13.049463 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:13.049430 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:16:13.050731 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:13.050706 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:16:13.064803 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:13.064763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podStartSLOduration=1.064751407 podStartE2EDuration="1.064751407s" podCreationTimestamp="2026-04-16 15:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:13.063332924 +0000 UTC m=+1443.432615408" watchObservedRunningTime="2026-04-16 15:16:13.064751407 +0000 UTC m=+1443.434033870"
Apr 16 15:16:14.052921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:14.052881 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:16:14.965532 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:14.965508 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:16:15.055714 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.055648 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1c7f898-5967-469f-be6b-a239528e1b81" containerID="923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f" exitCode=0
Apr 16 15:16:15.055714 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.055700 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"
Apr 16 15:16:15.056095 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.055729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" event={"ID":"c1c7f898-5967-469f-be6b-a239528e1b81","Type":"ContainerDied","Data":"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"}
Apr 16 15:16:15.056095 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.055764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf" event={"ID":"c1c7f898-5967-469f-be6b-a239528e1b81","Type":"ContainerDied","Data":"5aeb9687d6458b88fdecc6dcaeccf1ac18ae105951f18857882f35af4a057474"}
Apr 16 15:16:15.056095 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.055781 2575 scope.go:117] "RemoveContainer" containerID="923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"
Apr 16 15:16:15.062943 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.062928 2575 scope.go:117] "RemoveContainer" containerID="923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"
Apr 16 15:16:15.063168 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:16:15.063147 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f\": container with ID starting with 923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f not found: ID does not exist" containerID="923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"
Apr 16 15:16:15.063302 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.063179 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f"} err="failed to get container status \"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f\": rpc error: code = NotFound desc = could not find container \"923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f\": container with ID starting with 923bedc3b0bb90d3e4ff8a3410d35f4322d45243b2eeb2f3bf1791feff64202f not found: ID does not exist"
Apr 16 15:16:15.074377 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.074358 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:16:15.077802 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:15.077782 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c9f8b-predictor-b9f4c48f7-z6skf"]
Apr 16 15:16:16.203642 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:16.203602 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" path="/var/lib/kubelet/pods/c1c7f898-5967-469f-be6b-a239528e1b81/volumes"
Apr 16 15:16:24.053064 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:24.053025 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:16:34.052978 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:34.052931 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:16:44.053407 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:44.053365 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:16:52.131408 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.131375 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:16:52.131795 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.131629 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container" containerID="cri-o://4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799" gracePeriod=30
Apr 16 15:16:52.152688 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.152665 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"]
Apr 16 15:16:52.152955 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.152943 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container"
Apr 16 15:16:52.152999 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.152957 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container"
Apr 16 15:16:52.153031 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.153008 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1c7f898-5967-469f-be6b-a239528e1b81" containerName="kserve-container"
Apr 16 15:16:52.154640 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.154619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"
Apr 16 15:16:52.163027 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.163000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"]
Apr 16 15:16:52.163964 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.163948 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"
Apr 16 15:16:52.284949 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:52.284921 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"]
Apr 16 15:16:52.288078 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:16:52.287990 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7661a2c7_d428_401d_8a13_ea94d983720c.slice/crio-36b8e81b7e812ccd00e4e6e7758d3b5fd7310db3a3da9be9c209086663513226 WatchSource:0}: Error finding container 36b8e81b7e812ccd00e4e6e7758d3b5fd7310db3a3da9be9c209086663513226: Status 404 returned error can't find the container with id 36b8e81b7e812ccd00e4e6e7758d3b5fd7310db3a3da9be9c209086663513226
Apr 16 15:16:53.160028 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:53.159987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" event={"ID":"7661a2c7-d428-401d-8a13-ea94d983720c","Type":"ContainerStarted","Data":"4f80b3fc0a457cd7fb0c9d656f1f5ec4aa49dc9dfb5961680e1d5a11bcb47924"}
Apr 16 15:16:53.160028 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:53.160034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" event={"ID":"7661a2c7-d428-401d-8a13-ea94d983720c","Type":"ContainerStarted","Data":"36b8e81b7e812ccd00e4e6e7758d3b5fd7310db3a3da9be9c209086663513226"}
Apr 16 15:16:53.160465 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:53.160171 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"
Apr 16 15:16:53.161526 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:53.161500 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:16:53.174459 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:53.174418 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podStartSLOduration=1.174406205 podStartE2EDuration="1.174406205s" podCreationTimestamp="2026-04-16 15:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:16:53.173526866 +0000 UTC m=+1483.542809338" watchObservedRunningTime="2026-04-16 15:16:53.174406205 +0000 UTC m=+1483.543688676"
Apr 16 15:16:54.054380 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:54.054349 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:16:54.163277 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:54.163238 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:16:58.457342 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:58.457318 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:16:59.178691 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.178660 2575 generic.go:358] "Generic (PLEG): container finished" podID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerID="4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799" exitCode=0
Apr 16 15:16:59.178859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.178705 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" event={"ID":"285687a6-4d01-478f-bd88-043e9c2a35a8","Type":"ContainerDied","Data":"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"}
Apr 16 15:16:59.178859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.178714 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"
Apr 16 15:16:59.178859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.178728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh" event={"ID":"285687a6-4d01-478f-bd88-043e9c2a35a8","Type":"ContainerDied","Data":"a53f5ab0c11e976816bc4aea198f0d76935f1619e2477791717767ae981d36a5"}
Apr 16 15:16:59.178859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.178743 2575 scope.go:117] "RemoveContainer" containerID="4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"
Apr 16 15:16:59.185897 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.185878 2575 scope.go:117] "RemoveContainer" containerID="4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"
Apr 16 15:16:59.186251 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:16:59.186230 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799\": container with ID starting with 4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799 not found: ID does not exist" containerID="4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"
Apr 16 15:16:59.186336 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.186262 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799"} err="failed to get container status \"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799\": rpc error: code = NotFound desc = could not find container \"4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799\": container with ID starting with 4b4d98fd6a7221d3539f1332fd5eb80b3b0e4c3fc533a75c4c9d9f0094765799 not found: ID does not exist"
Apr 16 15:16:59.197714 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.197694 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:16:59.202621 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:16:59.202601 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2d44a-predictor-595d486c67-sz8xh"]
Apr 16 15:17:00.204591 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:00.204551 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" path="/var/lib/kubelet/pods/285687a6-4d01-478f-bd88-043e9c2a35a8/volumes"
Apr 16 15:17:04.164022 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:04.163986 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:17:10.163967 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:10.163932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:17:10.173861 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:10.173838 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log"
Apr 16 15:17:14.163311 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:14.163269 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:17:22.537189 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.537117 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:17:22.537909 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.537430 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" containerID="cri-o://abe33dd5336fd3d545405dcc378cc36072529a262e35f2683224ccde1457caf9" gracePeriod=30
Apr 16 15:17:22.558759 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.558730 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"]
Apr 16 15:17:22.559010 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.558985 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container"
Apr 16 15:17:22.559010 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.559000 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container"
Apr 16 15:17:22.559112 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.559056 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="285687a6-4d01-478f-bd88-043e9c2a35a8" containerName="kserve-container"
Apr 16 15:17:22.561149 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.561132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"
Apr 16 15:17:22.568048 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.568024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"]
Apr 16 15:17:22.570994 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.570977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"
Apr 16 15:17:22.695868 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:22.695828 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"]
Apr 16 15:17:22.698793 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:17:22.698765 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c987b34_a142_4636_a350_24394176c8a4.slice/crio-7c2ef72cf012578d8aa4118121dbb689c4502053921b3fe99d237165a6583244 WatchSource:0}: Error finding container 7c2ef72cf012578d8aa4118121dbb689c4502053921b3fe99d237165a6583244: Status 404 returned error can't find the container with id 7c2ef72cf012578d8aa4118121dbb689c4502053921b3fe99d237165a6583244
Apr 16 15:17:23.249030 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:23.248992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" event={"ID":"2c987b34-a142-4636-a350-24394176c8a4","Type":"ContainerStarted","Data":"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08"}
Apr 16 15:17:23.249030 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:23.249032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" event={"ID":"2c987b34-a142-4636-a350-24394176c8a4","Type":"ContainerStarted","Data":"7c2ef72cf012578d8aa4118121dbb689c4502053921b3fe99d237165a6583244"}
Apr 16 15:17:23.249248 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:23.249165 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"
Apr 16 15:17:23.250357 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:23.250331 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 15:17:23.265385 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:23.265339 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podStartSLOduration=1.2653269489999999 podStartE2EDuration="1.265326949s" podCreationTimestamp="2026-04-16 15:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:17:23.263783801 +0000 UTC m=+1513.633066291" watchObservedRunningTime="2026-04-16 15:17:23.265326949 +0000 UTC m=+1513.634609420"
Apr 16 15:17:24.053279 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:24.053237 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused"
Apr 16 15:17:24.164006 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:24.163969 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:17:24.251409 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:24.251374 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 15:17:25.255326 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:25.255294 2575 generic.go:358] "Generic (PLEG): container finished" podID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerID="abe33dd5336fd3d545405dcc378cc36072529a262e35f2683224ccde1457caf9" exitCode=0
Apr 16 15:17:25.255691 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:25.255333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" event={"ID":"f48f0101-4def-4526-9ed7-56889b09bd6f","Type":"ContainerDied","Data":"abe33dd5336fd3d545405dcc378cc36072529a262e35f2683224ccde1457caf9"}
Apr 16 15:17:25.375294 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:25.375273 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:17:26.258306 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:26.258274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98" event={"ID":"f48f0101-4def-4526-9ed7-56889b09bd6f","Type":"ContainerDied","Data":"de01913949d9619e0757daf4dbd90aae5851c1b2341ae536d1d09cb046fe89ac"}
Apr 16 15:17:26.258685 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:26.258315 2575 scope.go:117] "RemoveContainer" containerID="abe33dd5336fd3d545405dcc378cc36072529a262e35f2683224ccde1457caf9"
Apr 16 15:17:26.258685 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:26.258318 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"
Apr 16 15:17:26.272985 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:26.272966 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:17:26.276960 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:26.276941 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-36a90-predictor-749c75bdbc-lnz98"]
Apr 16 15:17:28.203921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:28.203889 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" path="/var/lib/kubelet/pods/f48f0101-4def-4526-9ed7-56889b09bd6f/volumes"
Apr 16 15:17:34.163828 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:34.163790 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 16 15:17:34.251802 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:34.251773 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 15:17:44.164366 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:44.164335 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"
Apr 16 15:17:44.251921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:44.251879 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 15:17:54.252429 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:17:54.252393 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 15:18:04.252902 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:04.252871 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"
Apr 16 15:18:12.408476 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.408444 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"]
Apr 16 15:18:12.408904 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.408735 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" containerID="cri-o://4f80b3fc0a457cd7fb0c9d656f1f5ec4aa49dc9dfb5961680e1d5a11bcb47924" gracePeriod=30
Apr 16 15:18:12.448938 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.448908 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"]
Apr 16 15:18:12.449174 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.449163 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container"
Apr 16 15:18:12.449240 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.449176 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container"
Apr 16 15:18:12.449240 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.449230 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f48f0101-4def-4526-9ed7-56889b09bd6f" containerName="kserve-container"
Apr 16 15:18:12.450998 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.450984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"
Apr 16 15:18:12.460541 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.460521 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"
Apr 16 15:18:12.460859 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.460844 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"]
Apr 16 15:18:12.580833 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:12.580796 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"]
Apr 16 15:18:12.584335 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:18:12.584303 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22813604_60d6_47d3_baee_bab5f60578ec.slice/crio-7f601834d389749b25a4ff9c4d488c3154cb0c1fef267a80fdaf1e4cb56dfbb2 WatchSource:0}: Error finding container 7f601834d389749b25a4ff9c4d488c3154cb0c1fef267a80fdaf1e4cb56dfbb2: Status 404 returned error can't find the container with id 7f601834d389749b25a4ff9c4d488c3154cb0c1fef267a80fdaf1e4cb56dfbb2
Apr 16 15:18:13.381838 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:13.381791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" event={"ID":"22813604-60d6-47d3-baee-bab5f60578ec","Type":"ContainerStarted","Data":"44614ecf31c3efad7c86fc347f71297f63a0bf828c82ebdf974003a2332fa670"}
Apr 16 15:18:13.381838 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:13.381840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" event={"ID":"22813604-60d6-47d3-baee-bab5f60578ec","Type":"ContainerStarted","Data":"7f601834d389749b25a4ff9c4d488c3154cb0c1fef267a80fdaf1e4cb56dfbb2"}
Apr 16 15:18:13.382085 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:13.381976 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"
Apr 16 15:18:13.383354 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:13.383329 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 16
15:18:13.396864 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:13.396822 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podStartSLOduration=1.396810607 podStartE2EDuration="1.396810607s" podCreationTimestamp="2026-04-16 15:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:18:13.395453138 +0000 UTC m=+1563.764735609" watchObservedRunningTime="2026-04-16 15:18:13.396810607 +0000 UTC m=+1563.766093078" Apr 16 15:18:14.163942 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:14.163905 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 16 15:18:14.384074 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:14.384036 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:18:15.387280 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:15.387252 2575 generic.go:358] "Generic (PLEG): container finished" podID="7661a2c7-d428-401d-8a13-ea94d983720c" containerID="4f80b3fc0a457cd7fb0c9d656f1f5ec4aa49dc9dfb5961680e1d5a11bcb47924" exitCode=0 Apr 16 15:18:15.387580 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:15.387321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" event={"ID":"7661a2c7-d428-401d-8a13-ea94d983720c","Type":"ContainerDied","Data":"4f80b3fc0a457cd7fb0c9d656f1f5ec4aa49dc9dfb5961680e1d5a11bcb47924"} Apr 16 15:18:15.449638 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:15.449621 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" Apr 16 15:18:16.390663 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:16.390627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" event={"ID":"7661a2c7-d428-401d-8a13-ea94d983720c","Type":"ContainerDied","Data":"36b8e81b7e812ccd00e4e6e7758d3b5fd7310db3a3da9be9c209086663513226"} Apr 16 15:18:16.390663 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:16.390647 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g" Apr 16 15:18:16.391052 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:16.390671 2575 scope.go:117] "RemoveContainer" containerID="4f80b3fc0a457cd7fb0c9d656f1f5ec4aa49dc9dfb5961680e1d5a11bcb47924" Apr 16 15:18:16.405804 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:16.405773 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"] Apr 16 15:18:16.408946 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:16.408924 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b8642-predictor-89bc8697-8tq5g"] Apr 16 15:18:18.203878 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:18.203834 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" path="/var/lib/kubelet/pods/7661a2c7-d428-401d-8a13-ea94d983720c/volumes" Apr 16 15:18:24.384123 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:24.384085 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:18:34.385030 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:34.384990 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:18:44.384503 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:44.384462 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 16 15:18:54.386125 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:18:54.386095 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" Apr 16 15:22:10.185604 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:22:10.185576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:22:10.192738 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:22:10.192718 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:26:37.477538 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.477507 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"] Apr 16 15:26:37.478014 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.477723 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" containerID="cri-o://69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08" gracePeriod=30 Apr 16 15:26:37.533989 ip-10-0-130-229 kubenswrapper[2575]: I0416 
15:26:37.533960 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:26:37.534367 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.534350 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" Apr 16 15:26:37.534367 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.534372 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" Apr 16 15:26:37.534477 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.534456 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7661a2c7-d428-401d-8a13-ea94d983720c" containerName="kserve-container" Apr 16 15:26:37.537407 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.537390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:26:37.547259 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.547239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:26:37.547541 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.547521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:26:37.658894 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.658871 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:26:37.661471 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:26:37.661440 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25744534_c8ff_42e9_bbd1_7a7752f5acdf.slice/crio-8c436d70369ea01a4fa85f4018f0dfde29c385476386421719f161336db47212 WatchSource:0}: Error finding container 8c436d70369ea01a4fa85f4018f0dfde29c385476386421719f161336db47212: Status 404 returned error can't find the container with id 8c436d70369ea01a4fa85f4018f0dfde29c385476386421719f161336db47212 Apr 16 15:26:37.663230 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.663190 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:26:37.719921 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:37.719899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" event={"ID":"25744534-c8ff-42e9-bbd1-7a7752f5acdf","Type":"ContainerStarted","Data":"8c436d70369ea01a4fa85f4018f0dfde29c385476386421719f161336db47212"} Apr 16 15:26:38.723671 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:38.723632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" event={"ID":"25744534-c8ff-42e9-bbd1-7a7752f5acdf","Type":"ContainerStarted","Data":"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0"} Apr 16 15:26:38.724082 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:38.723818 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:26:38.724888 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:38.724866 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:26:38.738517 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:38.738468 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podStartSLOduration=1.738452905 podStartE2EDuration="1.738452905s" podCreationTimestamp="2026-04-16 15:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:38.737007724 +0000 UTC m=+2069.106290197" watchObservedRunningTime="2026-04-16 15:26:38.738452905 +0000 UTC m=+2069.107735378" Apr 16 15:26:39.726388 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:39.726350 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:26:40.017668 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.017646 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" Apr 16 15:26:40.729546 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.729517 2575 generic.go:358] "Generic (PLEG): container finished" podID="2c987b34-a142-4636-a350-24394176c8a4" containerID="69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08" exitCode=0 Apr 16 15:26:40.729992 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.729558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" event={"ID":"2c987b34-a142-4636-a350-24394176c8a4","Type":"ContainerDied","Data":"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08"} Apr 16 15:26:40.729992 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.729583 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" Apr 16 15:26:40.729992 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.729598 2575 scope.go:117] "RemoveContainer" containerID="69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08" Apr 16 15:26:40.729992 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.729583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx" event={"ID":"2c987b34-a142-4636-a350-24394176c8a4","Type":"ContainerDied","Data":"7c2ef72cf012578d8aa4118121dbb689c4502053921b3fe99d237165a6583244"} Apr 16 15:26:40.736900 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.736884 2575 scope.go:117] "RemoveContainer" containerID="69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08" Apr 16 15:26:40.737144 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:26:40.737127 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08\": container with ID starting with 69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08 not found: ID does not exist" containerID="69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08" Apr 16 15:26:40.737205 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.737152 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08"} err="failed to get container status \"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08\": rpc error: code = NotFound desc = could not find container \"69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08\": container with ID starting with 69c84c55645f7cbb6a734feeb32948e12539e5a21c5a520555811c40bb57af08 not found: ID does not exist" Apr 16 15:26:40.744858 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.744837 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"] Apr 16 15:26:40.747851 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:40.747831 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-b2be2-predictor-855dffb447-hvjpx"] Apr 16 15:26:42.208531 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:42.208497 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c987b34-a142-4636-a350-24394176c8a4" path="/var/lib/kubelet/pods/2c987b34-a142-4636-a350-24394176c8a4/volumes" Apr 16 15:26:49.726680 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:49.726641 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:26:59.726849 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:26:59.726812 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:27:09.726663 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:09.726623 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:27:10.205274 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:10.205247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:27:10.211572 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:10.211553 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:27:19.728289 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:19.728258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:27:27.298641 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.298611 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"] Apr 16 15:27:27.299050 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.298831 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" containerID="cri-o://44614ecf31c3efad7c86fc347f71297f63a0bf828c82ebdf974003a2332fa670" gracePeriod=30 Apr 16 15:27:27.317765 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.317731 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:27:27.318024 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.318012 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" Apr 16 15:27:27.318085 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.318027 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" Apr 16 15:27:27.318125 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.318089 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c987b34-a142-4636-a350-24394176c8a4" containerName="kserve-container" Apr 16 15:27:27.321175 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.321161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:27:27.327432 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.327407 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:27:27.331444 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.331429 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:27:27.450151 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.450117 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:27:27.452880 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:27:27.452856 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6911b433_b46c_4c3f_8e79_91159bcdd409.slice/crio-e2be36c33a6d37a9a840a3c3d8db5f5e5d6756e95b5ffb82edb26184dc268bc4 WatchSource:0}: Error finding container e2be36c33a6d37a9a840a3c3d8db5f5e5d6756e95b5ffb82edb26184dc268bc4: Status 404 returned error can't find the container with id e2be36c33a6d37a9a840a3c3d8db5f5e5d6756e95b5ffb82edb26184dc268bc4 Apr 16 15:27:27.852511 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.852475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" event={"ID":"6911b433-b46c-4c3f-8e79-91159bcdd409","Type":"ContainerStarted","Data":"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f"} Apr 16 15:27:27.852511 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.852512 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" event={"ID":"6911b433-b46c-4c3f-8e79-91159bcdd409","Type":"ContainerStarted","Data":"e2be36c33a6d37a9a840a3c3d8db5f5e5d6756e95b5ffb82edb26184dc268bc4"} Apr 16 15:27:27.852709 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.852660 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:27:27.853834 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.853807 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:27:27.866388 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:27.866346 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podStartSLOduration=0.866335608 podStartE2EDuration="866.335608ms" podCreationTimestamp="2026-04-16 15:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:27:27.865797166 +0000 UTC m=+2118.235079639" watchObservedRunningTime="2026-04-16 15:27:27.866335608 +0000 UTC m=+2118.235618080" Apr 16 15:27:28.855543 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:28.855511 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:27:29.859701 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:29.859664 2575 generic.go:358] "Generic (PLEG): container finished" podID="22813604-60d6-47d3-baee-bab5f60578ec" containerID="44614ecf31c3efad7c86fc347f71297f63a0bf828c82ebdf974003a2332fa670" exitCode=0 Apr 16 15:27:29.860052 ip-10-0-130-229 kubenswrapper[2575]: I0416 
15:27:29.859743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" event={"ID":"22813604-60d6-47d3-baee-bab5f60578ec","Type":"ContainerDied","Data":"44614ecf31c3efad7c86fc347f71297f63a0bf828c82ebdf974003a2332fa670"} Apr 16 15:27:29.940928 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:29.940903 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" Apr 16 15:27:30.863168 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:30.863133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" event={"ID":"22813604-60d6-47d3-baee-bab5f60578ec","Type":"ContainerDied","Data":"7f601834d389749b25a4ff9c4d488c3154cb0c1fef267a80fdaf1e4cb56dfbb2"} Apr 16 15:27:30.863584 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:30.863175 2575 scope.go:117] "RemoveContainer" containerID="44614ecf31c3efad7c86fc347f71297f63a0bf828c82ebdf974003a2332fa670" Apr 16 15:27:30.863584 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:30.863146 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595" Apr 16 15:27:30.878942 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:30.878919 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"] Apr 16 15:27:30.883818 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:30.883799 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-cee76-predictor-84f4bbdc7-bf595"] Apr 16 15:27:32.204724 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:32.204691 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22813604-60d6-47d3-baee-bab5f60578ec" path="/var/lib/kubelet/pods/22813604-60d6-47d3-baee-bab5f60578ec/volumes" Apr 16 15:27:38.856139 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:38.856100 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:27:47.771656 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.771570 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:27:47.772115 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.771812 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" containerID="cri-o://dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0" gracePeriod=30 Apr 16 15:27:47.790035 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.790011 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:27:47.790293 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.790281 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" Apr 16 15:27:47.790340 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.790294 
2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" Apr 16 15:27:47.790340 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.790335 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="22813604-60d6-47d3-baee-bab5f60578ec" containerName="kserve-container" Apr 16 15:27:47.793159 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.793142 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:27:47.799282 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.799255 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:27:47.803099 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.803079 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:27:47.929875 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:47.929846 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:27:47.933341 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:27:47.933307 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10331708_d9ee_4429_ad87_60d96ad8fb70.slice/crio-624aaff8c329275ba624e8c683dfea7d9b36f9fc318c139458148847a872625c WatchSource:0}: Error finding container 624aaff8c329275ba624e8c683dfea7d9b36f9fc318c139458148847a872625c: Status 404 returned error can't find the container with id 624aaff8c329275ba624e8c683dfea7d9b36f9fc318c139458148847a872625c Apr 16 15:27:48.855612 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.855576 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:27:48.913648 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.913621 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" event={"ID":"10331708-d9ee-4429-ad87-60d96ad8fb70","Type":"ContainerStarted","Data":"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b"} Apr 16 15:27:48.913648 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.913651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" event={"ID":"10331708-d9ee-4429-ad87-60d96ad8fb70","Type":"ContainerStarted","Data":"624aaff8c329275ba624e8c683dfea7d9b36f9fc318c139458148847a872625c"} Apr 16 15:27:48.913830 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.913777 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:27:48.914735 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.914712 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 15:27:48.928087 
ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:48.928045 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podStartSLOduration=1.928033327 podStartE2EDuration="1.928033327s" podCreationTimestamp="2026-04-16 15:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:27:48.926776084 +0000 UTC m=+2139.296058579" watchObservedRunningTime="2026-04-16 15:27:48.928033327 +0000 UTC m=+2139.297315798" Apr 16 15:27:49.727151 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:49.727105 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 15:27:49.916401 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:49.916364 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 15:27:50.517290 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.517267 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:27:50.920361 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.920324 2575 generic.go:358] "Generic (PLEG): container finished" podID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerID="dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0" exitCode=0 Apr 16 15:27:50.920728 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.920371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" event={"ID":"25744534-c8ff-42e9-bbd1-7a7752f5acdf","Type":"ContainerDied","Data":"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0"} Apr 16 15:27:50.920728 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.920393 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" Apr 16 15:27:50.920728 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.920405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb" event={"ID":"25744534-c8ff-42e9-bbd1-7a7752f5acdf","Type":"ContainerDied","Data":"8c436d70369ea01a4fa85f4018f0dfde29c385476386421719f161336db47212"} Apr 16 15:27:50.920728 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.920423 2575 scope.go:117] "RemoveContainer" containerID="dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0" Apr 16 15:27:50.927458 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.927439 2575 scope.go:117] "RemoveContainer" containerID="dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0" Apr 16 15:27:50.927771 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:27:50.927705 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0\": container with ID starting with dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0 not found: ID does not exist" containerID="dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0" Apr 16 15:27:50.927771 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.927736 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0"} err="failed to get container status \"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0\": rpc error: code = NotFound desc = could not find container \"dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0\": container with ID starting with dff025c9a90b95bfe92f6f0f7462986f2b59e340cbe9531d7a41ce3d53a177f0 not found: ID does not exist" Apr 16 15:27:50.939384 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.939355 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:27:50.942629 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:50.942611 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1fc59-predictor-6797966c7-jp9kb"] Apr 16 15:27:52.203777 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:52.203748 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" path="/var/lib/kubelet/pods/25744534-c8ff-42e9-bbd1-7a7752f5acdf/volumes" Apr 16 15:27:58.856536 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:58.856454 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:27:59.917099 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:27:59.917059 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 15:28:08.856502 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:28:08.856462 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:28:09.917286 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:28:09.917246 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 15:28:18.857326 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:28:18.857295 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:28:19.916887 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:28:19.916839 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 15:28:29.917265 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:28:29.917236 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:32:10.221758 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:32:10.221736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:32:10.228409 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:32:10.228390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:37:02.710894 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:02.710809 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:37:02.711468 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:02.711133 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" containerID="cri-o://0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b" gracePeriod=30 Apr 16 15:37:05.052104 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.052084 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:37:05.386148 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.386116 2575 generic.go:358] "Generic (PLEG): container finished" podID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerID="0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b" exitCode=0 Apr 16 15:37:05.386305 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.386158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" event={"ID":"10331708-d9ee-4429-ad87-60d96ad8fb70","Type":"ContainerDied","Data":"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b"} Apr 16 15:37:05.386305 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.386180 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" Apr 16 15:37:05.386305 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.386184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8" event={"ID":"10331708-d9ee-4429-ad87-60d96ad8fb70","Type":"ContainerDied","Data":"624aaff8c329275ba624e8c683dfea7d9b36f9fc318c139458148847a872625c"} Apr 16 15:37:05.386305 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.386203 2575 scope.go:117] "RemoveContainer" containerID="0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b" Apr 16 15:37:05.394025 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.393829 2575 scope.go:117] "RemoveContainer" containerID="0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b" Apr 16 15:37:05.394232 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:37:05.394136 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b\": container with ID starting with 0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b not found: ID does not exist" containerID="0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b" Apr 16 15:37:05.394232 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.394174 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b"} err="failed to get container status \"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b\": rpc error: code = NotFound desc = could not find container \"0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b\": container with ID starting with 0ac982032c45021b7074a1517eaa52960bbc119065926f0a73df2c79d5d5977b not found: ID does not exist" Apr 16 15:37:05.405523 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.405501 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:37:05.407489 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:05.407469 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-41c6b-predictor-86f5c7cb45-9h6d8"] Apr 16 15:37:06.206604 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:06.206569 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" path="/var/lib/kubelet/pods/10331708-d9ee-4429-ad87-60d96ad8fb70/volumes" Apr 16 15:37:10.239133 
ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:10.239034 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:37:10.251915 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:37:10.244962 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:42:10.255954 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:42:10.255851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:42:10.261689 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:42:10.261672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:44:56.905725 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:56.905694 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:44:56.906257 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:56.905931 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" containerID="cri-o://e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f" gracePeriod=30 Apr 16 15:44:58.856427 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:58.856389 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 15:44:59.148149 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.148122 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:44:59.608138 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.608108 2575 generic.go:358] "Generic (PLEG): container finished" podID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerID="e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f" exitCode=0 Apr 16 15:44:59.608320 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.608155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" event={"ID":"6911b433-b46c-4c3f-8e79-91159bcdd409","Type":"ContainerDied","Data":"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f"} Apr 16 15:44:59.608320 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.608173 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" Apr 16 15:44:59.608320 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.608195 2575 scope.go:117] "RemoveContainer" containerID="e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f" Apr 16 15:44:59.608320 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.608182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs" event={"ID":"6911b433-b46c-4c3f-8e79-91159bcdd409","Type":"ContainerDied","Data":"e2be36c33a6d37a9a840a3c3d8db5f5e5d6756e95b5ffb82edb26184dc268bc4"} Apr 16 15:44:59.615687 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.615664 2575 scope.go:117] "RemoveContainer" containerID="e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f" Apr 16 15:44:59.616015 ip-10-0-130-229 kubenswrapper[2575]: E0416 15:44:59.615930 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f\": container with ID starting with e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f not found: ID does not exist" containerID="e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f" Apr 16 15:44:59.616015 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.615962 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f"} err="failed to get container status \"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f\": rpc error: code = NotFound desc = could not find container \"e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f\": container with ID starting with e756dcb7c0b0f4d01f676f60f8d94ea7450c98ec09e08dd0fce5c3201742750f not found: ID does not exist" Apr 16 15:44:59.628422 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.628396 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:44:59.631453 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:44:59.631431 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9c7d6-predictor-6d4df94467-hc4fs"] Apr 16 15:45:00.204806 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:00.204768 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" path="/var/lib/kubelet/pods/6911b433-b46c-4c3f-8e79-91159bcdd409/volumes" Apr 16 15:45:22.764427 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764395 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ndkd/must-gather-d52kz"] Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764629 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764640 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764648 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" Apr 16 15:45:22.764780 
ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764654 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764667 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764673 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764714 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="10331708-d9ee-4429-ad87-60d96ad8fb70" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764721 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="25744534-c8ff-42e9-bbd1-7a7752f5acdf" containerName="kserve-container" Apr 16 15:45:22.764780 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.764730 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6911b433-b46c-4c3f-8e79-91159bcdd409" containerName="kserve-container" Apr 16 15:45:22.767497 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.767478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.770256 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.770237 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"openshift-service-ca.crt\"" Apr 16 15:45:22.771490 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.771471 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ndkd\"/\"default-dockercfg-jmzsl\"" Apr 16 15:45:22.771640 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.771506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ndkd\"/\"kube-root-ca.crt\"" Apr 16 15:45:22.774159 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.774064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/must-gather-d52kz"] Apr 16 15:45:22.831062 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.831038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwgt\" (UniqueName: \"kubernetes.io/projected/7e73a782-2b97-40e1-9612-765119fc08ac-kube-api-access-5jwgt\") pod \"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.831156 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.831077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e73a782-2b97-40e1-9612-765119fc08ac-must-gather-output\") pod \"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.931938 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.931914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwgt\" (UniqueName: \"kubernetes.io/projected/7e73a782-2b97-40e1-9612-765119fc08ac-kube-api-access-5jwgt\") pod 
\"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.932021 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.931956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e73a782-2b97-40e1-9612-765119fc08ac-must-gather-output\") pod \"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.932318 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.932302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7e73a782-2b97-40e1-9612-765119fc08ac-must-gather-output\") pod \"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:22.939513 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:22.939493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwgt\" (UniqueName: \"kubernetes.io/projected/7e73a782-2b97-40e1-9612-765119fc08ac-kube-api-access-5jwgt\") pod \"must-gather-d52kz\" (UID: \"7e73a782-2b97-40e1-9612-765119fc08ac\") " pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:23.076749 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:23.076687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ndkd/must-gather-d52kz" Apr 16 15:45:23.185088 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:23.185061 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/must-gather-d52kz"] Apr 16 15:45:23.187992 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:45:23.187960 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e73a782_2b97_40e1_9612_765119fc08ac.slice/crio-87ef346799b01c50fc3c3450dd452dbe10dc445c15c8e0fd3f9bc36f569dcbe8 WatchSource:0}: Error finding container 87ef346799b01c50fc3c3450dd452dbe10dc445c15c8e0fd3f9bc36f569dcbe8: Status 404 returned error can't find the container with id 87ef346799b01c50fc3c3450dd452dbe10dc445c15c8e0fd3f9bc36f569dcbe8 Apr 16 15:45:23.189592 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:23.189576 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:45:23.668570 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:23.668537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/must-gather-d52kz" event={"ID":"7e73a782-2b97-40e1-9612-765119fc08ac","Type":"ContainerStarted","Data":"87ef346799b01c50fc3c3450dd452dbe10dc445c15c8e0fd3f9bc36f569dcbe8"} Apr 16 15:45:24.673392 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:24.673357 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/must-gather-d52kz" event={"ID":"7e73a782-2b97-40e1-9612-765119fc08ac","Type":"ContainerStarted","Data":"e4ff8a69e0f81998e230d2dbfc883f619920d99aba53309ee09650b8dfd5b699"} Apr 16 15:45:24.673392 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:24.673398 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/must-gather-d52kz" event={"ID":"7e73a782-2b97-40e1-9612-765119fc08ac","Type":"ContainerStarted","Data":"5e78d6069f3755f0bd69ceecd4de57510ae861a236e01f6da2c43ea5cc166f1b"} Apr 16 15:45:24.688352 ip-10-0-130-229 
kubenswrapper[2575]: I0416 15:45:24.688312 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ndkd/must-gather-d52kz" podStartSLOduration=1.632169765 podStartE2EDuration="2.688295832s" podCreationTimestamp="2026-04-16 15:45:22 +0000 UTC" firstStartedPulling="2026-04-16 15:45:23.189703941 +0000 UTC m=+3193.558986391" lastFinishedPulling="2026-04-16 15:45:24.245830008 +0000 UTC m=+3194.615112458" observedRunningTime="2026-04-16 15:45:24.687187604 +0000 UTC m=+3195.056470088" watchObservedRunningTime="2026-04-16 15:45:24.688295832 +0000 UTC m=+3195.057578310" Apr 16 15:45:25.600621 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:25.600587 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-kjjnh_0892e381-08bb-4454-99af-9dd414b35525/global-pull-secret-syncer/0.log" Apr 16 15:45:25.748985 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:25.748957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cfqtw_497451ed-b0c8-4109-a9ab-9689ebba5dba/konnectivity-agent/0.log" Apr 16 15:45:25.848445 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:25.848421 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-229.ec2.internal_e289990daac467a09b205e75b6286d5f/haproxy/0.log" Apr 16 15:45:29.542112 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:29.542071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkrz2_89195b92-e59e-4af4-b4c4-06dc7d7e796e/node-exporter/0.log" Apr 16 15:45:29.564016 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:29.563987 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkrz2_89195b92-e59e-4af4-b4c4-06dc7d7e796e/kube-rbac-proxy/0.log" Apr 16 15:45:29.587097 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:29.587074 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qkrz2_89195b92-e59e-4af4-b4c4-06dc7d7e796e/init-textfile/0.log" Apr 16 15:45:31.202035 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:31.202007 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-5wrz2_8b8f0df0-4d3f-4fdd-894b-fd928f0d7481/networking-console-plugin/0.log" Apr 16 15:45:32.284908 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.284879 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9"] Apr 16 15:45:32.289190 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.289168 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.297739 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.297718 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9"] Apr 16 15:45:32.413056 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.413026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-lib-modules\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.413238 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.413077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-sys\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.413238 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.413144 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzmfh\" (UniqueName: \"kubernetes.io/projected/abbc6dc9-3acb-493c-a101-ea3044e473fc-kube-api-access-qzmfh\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.413238 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.413195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-proc\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.413238 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.413238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-podres\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513622 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-sys\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzmfh\" (UniqueName: \"kubernetes.io/projected/abbc6dc9-3acb-493c-a101-ea3044e473fc-kube-api-access-qzmfh\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" 
(UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-proc\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-podres\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-lib-modules\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-sys\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-proc\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.513786 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-lib-modules\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.514015 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.513798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abbc6dc9-3acb-493c-a101-ea3044e473fc-podres\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.521028 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.521009 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzmfh\" (UniqueName: \"kubernetes.io/projected/abbc6dc9-3acb-493c-a101-ea3044e473fc-kube-api-access-qzmfh\") pod \"perf-node-gather-daemonset-475c9\" (UID: \"abbc6dc9-3acb-493c-a101-ea3044e473fc\") " pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.601267 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.601246 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:32.732435 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:32.732408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9"] Apr 16 15:45:32.742169 ip-10-0-130-229 kubenswrapper[2575]: W0416 15:45:32.740721 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podabbc6dc9_3acb_493c_a101_ea3044e473fc.slice/crio-ff6a6c87d31ada8f9d2d50b225f10a677bbe040c7439664e53eaba1e7ce47c92 WatchSource:0}: Error finding container ff6a6c87d31ada8f9d2d50b225f10a677bbe040c7439664e53eaba1e7ce47c92: Status 404 returned error can't find the container with id ff6a6c87d31ada8f9d2d50b225f10a677bbe040c7439664e53eaba1e7ce47c92 Apr 16 15:45:33.065145 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.065075 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m6qlb_7badd0a7-a664-4046-8cb3-c1bf570dc29b/dns/0.log" Apr 16 15:45:33.084364 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.084338 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m6qlb_7badd0a7-a664-4046-8cb3-c1bf570dc29b/kube-rbac-proxy/0.log" Apr 16 15:45:33.147998 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.147977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4vrpg_55ad67c5-7bb1-4c6c-8c58-869beff80d7f/dns-node-resolver/0.log" Apr 16 15:45:33.612432 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.612407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnmnk_38c4c3ee-509f-467b-b050-9b723bfce014/node-ca/0.log" Apr 16 15:45:33.704790 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.704757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" event={"ID":"abbc6dc9-3acb-493c-a101-ea3044e473fc","Type":"ContainerStarted","Data":"b25066f19d169a8343580f43843ca3980a00c296bf2348f21b3f4ff6e47ad3cf"} Apr 16 15:45:33.704948 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.704796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" event={"ID":"abbc6dc9-3acb-493c-a101-ea3044e473fc","Type":"ContainerStarted","Data":"ff6a6c87d31ada8f9d2d50b225f10a677bbe040c7439664e53eaba1e7ce47c92"} Apr 16 15:45:33.704948 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.704889 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:33.720350 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:33.720310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" podStartSLOduration=1.720297538 podStartE2EDuration="1.720297538s" podCreationTimestamp="2026-04-16 15:45:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:45:33.719588803 +0000 UTC m=+3204.088871276" watchObservedRunningTime="2026-04-16 15:45:33.720297538 +0000 UTC m=+3204.089580012" Apr 16 15:45:34.551749 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:34.551717 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-cvpwl_5803f957-e9d1-4ccf-a732-54889272611a/serve-healthcheck-canary/0.log" Apr 16 15:45:35.065138 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:35.065112 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rghv9_e9a74039-5a10-491d-97b9-0057432158ab/kube-rbac-proxy/0.log" Apr 16 15:45:35.085558 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:35.085532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rghv9_e9a74039-5a10-491d-97b9-0057432158ab/exporter/0.log" Apr 16 15:45:35.104942 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:35.104917 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rghv9_e9a74039-5a10-491d-97b9-0057432158ab/extractor/0.log" Apr 16 15:45:36.877892 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:36.877865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7669bdc57-cs48h_e25beb85-308f-4736-abc2-ce6cbe14a1cc/manager/0.log" Apr 16 15:45:36.918936 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:36.918911 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-vgvrp_6bb090bd-89f7-4a8b-a80d-d41294f7df52/server/0.log" Apr 16 15:45:37.181888 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:37.181820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-cjjf2_8bcff461-ab8a-4d95-8e77-101e852f5b21/manager/0.log" Apr 16 15:45:39.717752 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:39.717722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ndkd/perf-node-gather-daemonset-475c9" Apr 16 15:45:42.040840 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.040814 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/kube-multus-additional-cni-plugins/0.log" Apr 16 15:45:42.065654 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.065629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/egress-router-binary-copy/0.log" Apr 16 15:45:42.087993 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.087974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/cni-plugins/0.log" Apr 16 15:45:42.109069 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.109048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/bond-cni-plugin/0.log" Apr 16 15:45:42.130270 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.130247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/routeoverride-cni/0.log" Apr 16 15:45:42.150246 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.150221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/whereabouts-cni-bincopy/0.log" Apr 16 15:45:42.169992 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.169940 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9pkk7_e0f509b4-277a-43cc-ab72-a486e31674af/whereabouts-cni/0.log" Apr 16 15:45:42.533918 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.533840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sg7qn_74f9607d-cf87-4aa2-af48-8f1cbac463ed/kube-multus/0.log" Apr 16 15:45:42.646962 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.646933 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qf2x7_012eecca-9f9b-4a13-8adc-05b585fd794b/network-metrics-daemon/0.log" Apr 16 15:45:42.667000 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:42.666970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qf2x7_012eecca-9f9b-4a13-8adc-05b585fd794b/kube-rbac-proxy/0.log" Apr 16 15:45:43.401369 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.401345 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-controller/0.log" Apr 16 15:45:43.423099 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.423078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/0.log" Apr 16 15:45:43.437399 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.437381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovn-acl-logging/1.log" Apr 16 15:45:43.456904 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.456854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/kube-rbac-proxy-node/0.log" Apr 16 15:45:43.477105 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.477079 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:45:43.494431 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.494410 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/northd/0.log" Apr 16 15:45:43.519357 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.519332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/nbdb/0.log" Apr 16 15:45:43.539444 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.539425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/sbdb/0.log" Apr 16 15:45:43.664660 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:43.664631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-75cxs_63b14197-55a5-4407-8c24-397ab7006750/ovnkube-controller/0.log" Apr 16 15:45:45.204072 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:45.204017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-85ccl_4a74642f-0b47-4e56-931c-041808066f04/network-check-target-container/0.log" Apr 16 15:45:46.065195 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:46.065164 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-operator_iptables-alerter-cpd95_e9568559-bb27-4f69-aa6e-e169bbfd3048/iptables-alerter/0.log" Apr 16 15:45:46.606524 ip-10-0-130-229 kubenswrapper[2575]: I0416 15:45:46.606496 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-b64tn_39650afc-ce1c-4648-83c6-5b4969c0db6a/tuned/0.log"