Apr 24 22:29:36.852998 ip-10-0-142-202 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 22:29:37.363129 ip-10-0-142-202 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:37.363129 ip-10-0-142-202 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 22:29:37.363129 ip-10-0-142-202 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:37.363129 ip-10-0-142-202 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 22:29:37.363129 ip-10-0-142-202 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 22:29:37.365347 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.365174 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 22:29:37.369674 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369660 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:37.369674 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369674 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369679 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369682 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369686 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369690 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369695 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369699 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369702 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369705 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369708 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369711 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369713 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369716 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369720 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369722 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369725 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369728 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369731 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369733 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:37.369734 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369741 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369744 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369747 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369750 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369753 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369755 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369758 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369761 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369764 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369766 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369769 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369774 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369777 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369780 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369783 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369786 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369789 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369792 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369795 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369798 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:37.370181 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369800 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369803 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369806 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369809 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369811 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369814 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369817 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369819 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369822 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369824 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369827 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369829 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369832 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369835 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369838 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369841 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369844 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369847 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369850 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369852 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:37.370682 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369855 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369858 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369860 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369863 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369866 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369868 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369871 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369874 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369878 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369880 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369883 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369886 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369888 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369891 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369893 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369897 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369899 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369902 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369905 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:37.371174 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369907 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369910 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369913 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369916 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369918 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369921 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.369923 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370303 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370308 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370311 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370313 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370316 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370320 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370322 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370325 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370328 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370330 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370333 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370336 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370339 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:37.371697 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370343 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370346 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370349 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370351 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370354 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370357 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370360 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370362 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370369 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370372 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370375 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370378 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370382 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370385 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370388 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370391 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370394 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370397 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370400 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:37.372155 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370403 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370406 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370408 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370411 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370414 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370416 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370419 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370422 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370424 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370427 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370429 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370432 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370435 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370438 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370440 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370443 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370445 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370448 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370451 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370454 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:37.372642 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370457 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370459 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370462 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370465 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370467 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370470 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370472 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370475 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370478 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370481 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370483 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370486 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370489 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370492 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370495 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370498 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370500 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370503 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370506 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370508 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:37.373129 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370511 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370514 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370517 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370520 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370522 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370525 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370527 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370530 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370532 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370535 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370537 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370540 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370542 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.370545 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370628 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370637 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370644 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370648 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370653 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370656 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370661 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 22:29:37.373628 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370666 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370670 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370673 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370677 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370680 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370684 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370687 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370690 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370694 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370697 2565 flags.go:64] FLAG: --cloud-config=""
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370700 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370704 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370708 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370711 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370714 2565 flags.go:64] FLAG: --config-dir=""
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370718 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370721 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370725 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370728 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370731 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370734 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370737 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370740 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370743 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370747 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 22:29:37.374156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370750 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370754 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370758 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370761 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370764 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370767 2565 flags.go:64] FLAG: --enable-server="true"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370771 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370776 2565 flags.go:64] FLAG: --event-burst="100"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370779 2565 flags.go:64] FLAG: --event-qps="50"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370782 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370786 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370789 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370793 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370796 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370799 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370803 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370806 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370809 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370812 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370815 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370818 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370821 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370824 2565 flags.go:64] FLAG: --feature-gates=""
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370828 2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370831 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 22:29:37.374764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370834 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370837 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370841 2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370844 2565 flags.go:64] FLAG: --help="false"
Apr 24 22:29:37.375378 ip-10-0-142-202
kubenswrapper[2565]: I0424 22:29:37.370847 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-142-202.ec2.internal" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370850 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370853 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370856 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370860 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370863 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370867 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370870 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370873 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370877 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370880 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370883 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370886 2565 flags.go:64] FLAG: --kube-reserved="" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370889 2565 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370892 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370895 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370898 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370902 2565 flags.go:64] FLAG: --lock-file="" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370905 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370908 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 22:29:37.375378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370911 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370916 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370919 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370922 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370926 2565 flags.go:64] FLAG: --logging-format="text" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370929 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370932 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370935 2565 flags.go:64] FLAG: --manifest-url="" Apr 24 22:29:37.375974 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:29:37.370938 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370943 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370946 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370951 2565 flags.go:64] FLAG: --max-pods="110" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370954 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370957 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370960 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370963 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370966 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370969 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370973 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370980 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370984 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370987 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370991 
2565 flags.go:64] FLAG: --pod-cidr="" Apr 24 22:29:37.375974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.370994 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371000 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371004 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371007 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371010 2565 flags.go:64] FLAG: --port="10250" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371014 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371017 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0670e02fe6d3e1675" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371020 2565 flags.go:64] FLAG: --qos-reserved="" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371024 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371027 2565 flags.go:64] FLAG: --register-node="true" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371030 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371033 2565 flags.go:64] FLAG: --register-with-taints="" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371036 2565 flags.go:64] FLAG: --registry-burst="10" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371039 2565 flags.go:64] FLAG: --registry-qps="5" 
Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371043 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371046 2565 flags.go:64] FLAG: --reserved-memory="" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371050 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371053 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371056 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371059 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371062 2565 flags.go:64] FLAG: --runonce="false" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371069 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371073 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371076 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371079 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371082 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 22:29:37.376660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371085 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371089 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:29:37.371092 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371096 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371099 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371101 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371104 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371108 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371120 2565 flags.go:64] FLAG: --system-cgroups="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371123 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371129 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371132 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371135 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371140 2565 flags.go:64] FLAG: --tls-min-version="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371142 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371145 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371148 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 
22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371151 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371154 2565 flags.go:64] FLAG: --v="2" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371161 2565 flags.go:64] FLAG: --version="false" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371168 2565 flags.go:64] FLAG: --vmodule="" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371172 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.371176 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371264 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 22:29:37.377294 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371268 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371271 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371274 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371278 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371281 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371284 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 
22:29:37.371286 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371289 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371292 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371295 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371297 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371300 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371304 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371306 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371309 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371312 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371314 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371317 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371320 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata 
Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371322 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 22:29:37.377893 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371325 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371328 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371330 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371333 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371336 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371338 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371341 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371345 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371348 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371350 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371353 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371356 2565 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371358 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371361 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371364 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371368 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371370 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371373 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371376 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371378 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 22:29:37.378414 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371381 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371384 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371387 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371389 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 
22:29:37.371393 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371397 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371400 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371403 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371406 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371408 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371411 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371414 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371416 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371419 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371421 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371424 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371428 2565 
feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371431 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 22:29:37.378926 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371434 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371441 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371444 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371447 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371450 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371453 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371456 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371458 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371461 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371465 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371468 2565 feature_gate.go:328] unrecognized 
feature gate: InsightsOnDemandDataGather Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371470 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371473 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371476 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371478 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371481 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371484 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371487 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371489 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371492 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 22:29:37.379375 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371495 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371498 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371501 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 22:29:37.379918 ip-10-0-142-202 
kubenswrapper[2565]: W0424 22:29:37.371503 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371506 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371508 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.371511 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.372483 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.379075 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.379089 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379144 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379150 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379153 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379156 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379159 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379163 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:37.379918 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379166 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379168 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379171 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379175 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379178 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379181 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379183 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379186 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379189 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379193 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379196 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379199 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379202 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379204 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379207 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379210 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379212 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379215 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379217 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:37.380310 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379220 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379222 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379226 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379229 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379231 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379233 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379238 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379242 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379247 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379249 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379252 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379255 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379258 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379261 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379264 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379267 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379270 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379273 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379275 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379278 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:37.380850 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379281 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379283 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379286 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379289 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379292 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379295 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379297 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379300 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379303 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379305 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379308 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379311 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379314 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379316 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379319 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379322 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379324 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379327 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379331 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:37.381345 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379334 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379337 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379340 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379343 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379345 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379348 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379351 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379353 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379356 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379359 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379361 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379364 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379366 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379369 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379372 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379374 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379377 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379380 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379383 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379385 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:37.381823 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379388 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379391 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.379396 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379493 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379498 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379501 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379504 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379508 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379511 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379513 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379516 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379519 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379522 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379525 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379528 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 22:29:37.382298 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379532 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379535 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379538 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379542 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379544 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379547 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379550 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379552 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379555 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379558 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379560 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379563 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379566 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379568 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379587 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379592 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379595 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379598 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379601 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379603 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 22:29:37.382683 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379606 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379609 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379611 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379614 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379617 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379620 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379622 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379625 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379628 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379630 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379633 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379637 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379639 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379642 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379645 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379647 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379650 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379653 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379655 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379658 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 22:29:37.383173 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379661 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379663 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379666 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379669 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379671 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379674 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379677 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379679 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379682 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379684 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379687 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379690 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379692 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379695 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379697 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379700 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379703 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379706 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379708 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379711 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 22:29:37.383670 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379713 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379716 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379719 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379721 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379724 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379727 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379729 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379732 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379735 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379738 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379740 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379743 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379745 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:37.379748 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.379753 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 22:29:37.384162 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.380611 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 22:29:37.385247 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.385234 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 22:29:37.386465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.386453 2565 server.go:1019] "Starting client certificate rotation"
Apr 24 22:29:37.386565 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.386549 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:37.386681 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.386599 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 22:29:37.413376 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.413357 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:37.416099 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.416082 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 22:29:37.431951 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.431930 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 24 22:29:37.438956 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.438938 2565 log.go:25] "Validated CRI v1 image API"
Apr 24 22:29:37.440282 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.440267 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 22:29:37.441402 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.441384 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:37.444315 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.444297 2565 fs.go:135] Filesystem UUIDs: map[1a89982d-71d4-4a32-9cdf-b3d0f0e08826:/dev/nvme0n1p3 6e39291b-2067-4b08-9a95-f27a01fc8ec3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 22:29:37.444366 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.444316 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 22:29:37.450245 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.450139 2565 manager.go:217] Machine: {Timestamp:2026-04-24 22:29:37.447971031 +0000 UTC m=+0.463655437 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099319 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2007bdf53fbed1c650f35e7d28d151 SystemUUID:ec2007bd-f53f-bed1-c650-f35e7d28d151 BootID:9c88533f-8f46-435c-a3f1-f4feef8cfdbc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fd:85:30:75:97 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fd:85:30:75:97 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:da:29:c4:58:e8:b6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 22:29:37.450245 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.450242 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 22:29:37.450346 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.450320 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 22:29:37.451419 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.451398 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 22:29:37.451553 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.451420 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-202.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 22:29:37.451608 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.451563 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 22:29:37.451608 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.451583 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 22:29:37.451608 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.451596 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:37.452421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.452411 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 22:29:37.454488 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.454478 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:37.454653 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.454644 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 22:29:37.457524 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.457514 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 22:29:37.457555 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.457533 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 22:29:37.457555 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.457545 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 22:29:37.457629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.457557 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 24 22:29:37.457629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.457588 2565 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 24 22:29:37.458789 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.458775 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:37.458828 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.458801 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 22:29:37.461978 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.461962 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 22:29:37.463786 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.463773 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 22:29:37.465954 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465943 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465960 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465966 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465971 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465977 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465983 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465989 2565 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 24 22:29:37.465996 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.465994 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 22:29:37.466177 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.466001 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 22:29:37.466177 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.466007 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 22:29:37.466177 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.466016 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 22:29:37.466177 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.466025 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 22:29:37.467039 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.467030 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 22:29:37.467082 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.467039 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 22:29:37.470774 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.470759 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 22:29:37.470859 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.470795 2565 server.go:1295] "Started kubelet" Apr 24 22:29:37.472606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.472535 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 22:29:37.472670 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.472634 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 22:29:37.473138 ip-10-0-142-202 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 22:29:37.473611 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.473425 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 22:29:37.473611 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.473517 2565 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-202.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 22:29:37.473940 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.473840 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-202.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 24 22:29:37.474007 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.473961 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 24 22:29:37.474146 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.474046 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 22:29:37.477236 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.477218 2565 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 22:29:37.481482 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.480453 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-202.ec2.internal.18a96b8c1b3a4c72 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-202.ec2.internal,UID:ip-10-0-142-202.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-202.ec2.internal,},FirstTimestamp:2026-04-24 22:29:37.47077029 +0000 UTC m=+0.486454681,LastTimestamp:2026-04-24 22:29:37.47077029 +0000 UTC m=+0.486454681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-202.ec2.internal,}"
Apr 24 22:29:37.483230 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.483211 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 22:29:37.483230 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.483226 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:37.484064 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484017 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 22:29:37.484151 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484064 2565 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 22:29:37.484151 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484080 2565 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 22:29:37.484151 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484142 2565 factory.go:153] Registering CRI-O factory
Apr 24 22:29:37.484151 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484145 2565 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 22:29:37.484151 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484155 2565 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484161 2565 factory.go:223] Registration of the crio container factory successfully
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.484162 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484206 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484213 2565 factory.go:55] Registering systemd factory
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484219 2565 factory.go:223] Registration of the systemd container factory successfully
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484238 2565 factory.go:103] Registering Raw factory
Apr 24 22:29:37.484373 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484247 2565 manager.go:1196] Started watching for new ooms in manager
Apr 24 22:29:37.484687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.484568 2565 manager.go:319] Starting recovery of all containers
Apr 24 22:29:37.486234 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.486217 2565 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 24 22:29:37.486537 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.486522 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fd48d"
Apr 24 22:29:37.494008 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.493983 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fd48d"
Apr 24 22:29:37.494391 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.494361 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 24 22:29:37.494524 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.494473 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-202.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 24 22:29:37.495365 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.495350 2565 manager.go:324] Recovery completed
Apr 24 22:29:37.496567 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.496546 2565 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 24 22:29:37.499279 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.499265 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.502286 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502268 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.502367 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502296 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.502367 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502306 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.502805 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502789 2565 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 24 22:29:37.502805 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502805 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 24 22:29:37.502897 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.502822 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 22:29:37.506802 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.506790 2565 policy_none.go:49] "None policy: Start"
Apr 24 22:29:37.506846 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.506823 2565 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 22:29:37.506846 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.506834 2565 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.546766 2565 manager.go:341] "Starting Device Plugin manager"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.546800 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.546810 2565 server.go:85] "Starting device plugin registration server"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.546996 2565 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.547004 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.547115 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.547202 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.547210 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.547718 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 24 22:29:37.562760 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.547753 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.569639 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.569616 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 24 22:29:37.570797 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.570785 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 24 22:29:37.570856 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.570811 2565 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 24 22:29:37.570856 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.570827 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 24 22:29:37.570856 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.570833 2565 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 24 22:29:37.570980 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.570897 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 24 22:29:37.576384 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.576368 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:37.647524 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.647455 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.649651 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.649629 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.649740 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.649660 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.649740 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.649671 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.649740 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.649694 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.659330 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.659315 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.659374 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.659339 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-202.ec2.internal\": node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.671289 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.671268 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"]
Apr 24 22:29:37.671355 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.671323 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.672037 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.672018 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.672118 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.672048 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.672118 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.672060 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.673362 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.673347 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.674267 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.674254 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.674423 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.674407 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.674486 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.674444 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.675571 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675558 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.675649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675594 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.675649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675608 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.675649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675609 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.675649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675635 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.675649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.675643 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.677753 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.677738 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.677804 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.677761 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 24 22:29:37.678452 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.678435 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientMemory"
Apr 24 22:29:37.678519 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.678463 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasNoDiskPressure"
Apr 24 22:29:37.678519 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.678472 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeHasSufficientPID"
Apr 24 22:29:37.702094 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.702070 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-202.ec2.internal\" not found" node="ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.706140 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.706124 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-202.ec2.internal\" not found" node="ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.773791 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.773765 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.785235 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.785219 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.785300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.785244 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.785300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.785262 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90b166908a0de8e6af429a39155ebb21-config\") pod \"kube-apiserver-proxy-ip-10-0-142-202.ec2.internal\" (UID: \"90b166908a0de8e6af429a39155ebb21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.874688 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.874651 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:37.886008 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.885990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.886057 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.886014 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90b166908a0de8e6af429a39155ebb21-config\") pod \"kube-apiserver-proxy-ip-10-0-142-202.ec2.internal\" (UID: \"90b166908a0de8e6af429a39155ebb21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.886057 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.886034 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.886116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.886079 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.886116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.886096 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9cfb93022e523476bc812de84da59f57-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal\" (UID: \"9cfb93022e523476bc812de84da59f57\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.886178 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:37.886108 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/90b166908a0de8e6af429a39155ebb21-config\") pod \"kube-apiserver-proxy-ip-10-0-142-202.ec2.internal\" (UID: \"90b166908a0de8e6af429a39155ebb21\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:37.975434 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:37.975406 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.003893 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.003865 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:38.008594 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.008563 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:38.076525 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.076340 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.176829 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.176801 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.277422 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.277339 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.377987 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.377947 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.386279 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.386259 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 22:29:38.386427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.386410 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 22:29:38.466479 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.466449 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:38.478193 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.478167 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.483340 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.483310 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 22:29:38.491207 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.491185 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 22:29:38.497564 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.497535 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 22:24:37 +0000 UTC" deadline="2028-01-28 03:45:43.025352218 +0000 UTC"
Apr 24 22:29:38.497564 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.497564 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15437h16m4.527792596s"
Apr 24 22:29:38.513292 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.513273 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gbwfn"
Apr 24 22:29:38.519078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.519059 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gbwfn"
Apr 24 22:29:38.579045 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.578959 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.598295 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:38.598255 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfb93022e523476bc812de84da59f57.slice/crio-0852f4e4f80497b90d17d1049bdd9217e4f5e9b4019c24efb6a838a89d530ab3 WatchSource:0}: Error finding container 0852f4e4f80497b90d17d1049bdd9217e4f5e9b4019c24efb6a838a89d530ab3: Status 404 returned error can't find the container with id 0852f4e4f80497b90d17d1049bdd9217e4f5e9b4019c24efb6a838a89d530ab3
Apr 24 22:29:38.598721 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:38.598695 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b166908a0de8e6af429a39155ebb21.slice/crio-585ec34af61dc46400591668d1225e9e9658f90fdf2e2292d4360c89bcec9e13 WatchSource:0}: Error finding container 585ec34af61dc46400591668d1225e9e9658f90fdf2e2292d4360c89bcec9e13: Status 404 returned error can't find the container with id 585ec34af61dc46400591668d1225e9e9658f90fdf2e2292d4360c89bcec9e13
Apr 24 22:29:38.603385 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.603370 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:29:38.679592 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.679541 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.690246 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.689220 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:38.780033 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.780004 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.880569 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:38.880503 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-202.ec2.internal\" not found"
Apr 24 22:29:38.897208 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.897185 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 22:29:38.984344 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.984313 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:38.996984 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.996951 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:38.998036 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:38.998014 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal"
Apr 24 22:29:39.005833 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.005790 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 22:29:39.458123 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.458094 2565 apiserver.go:52] "Watching apiserver"
Apr 24 22:29:39.464808 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.464781 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 22:29:39.465211 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.465187 2565 kubelet.go:2537] "SyncLoop ADD"
source="api" pods=["kube-system/konnectivity-agent-tk5jk","openshift-cluster-node-tuning-operator/tuned-lsvxp","openshift-dns/node-resolver-ztc7p","openshift-multus/multus-additional-cni-plugins-swjrh","openshift-multus/multus-pmbqp","openshift-network-diagnostics/network-check-target-lgf7j","openshift-network-operator/iptables-alerter-gkgwr","openshift-ovn-kubernetes/ovnkube-node-sm54g","kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx","openshift-image-registry/node-ca-c2mrs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal","openshift-multus/network-metrics-daemon-9bjhd"] Apr 24 22:29:39.470363 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.470335 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.472443 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.472423 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.472556 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.472479 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 22:29:39.472637 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.472571 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.472689 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.472661 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sj7g7\"" Apr 24 22:29:39.472733 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.472708 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.474390 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.474344 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.474493 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.474345 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tsrpb\"" Apr 24 22:29:39.474665 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.474644 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.474858 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.474840 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.475192 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.474935 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.476500 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476479 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gng82\"" Apr 24 22:29:39.476618 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476514 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.476618 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476569 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.476795 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476780 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-86tzl\"" Apr 24 22:29:39.476795 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476788 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 22:29:39.476882 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.476840 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 22:29:39.477019 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.477004 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.477213 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.477195 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.477295 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.477218 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 22:29:39.477396 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.477364 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.478922 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.478892 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 22:29:39.479059 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.479017 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-v5cl9\"" Apr 24 22:29:39.479667 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.479649 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:29:39.479754 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.479732 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a" Apr 24 22:29:39.481838 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.481818 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:39.485006 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.484737 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-t2fll\"" Apr 24 22:29:39.485006 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.484998 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 22:29:39.485148 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.485040 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 22:29:39.485431 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.485411 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.487496 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.487476 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.487604 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.487511 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.487753 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.487734 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 22:29:39.488113 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.488094 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h949l\"" Apr 24 22:29:39.488193 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.488093 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 22:29:39.488193 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.488183 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.488280 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.488104 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 22:29:39.488499 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.488479 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 22:29:39.489253 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.489234 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4847g\"" Apr 24 22:29:39.489346 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.489276 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 22:29:39.489487 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.489469 2565 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.489566 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.489498 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.489858 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.489842 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.491416 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.491399 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 22:29:39.491534 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.491517 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 22:29:39.491624 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.491548 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 22:29:39.491770 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.491749 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7xlk2\"" Apr 24 22:29:39.492126 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.492111 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:39.492197 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.492168 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e" Apr 24 22:29:39.494045 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.493896 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-modprobe-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.494045 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.493933 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkg2s\" (UniqueName: \"kubernetes.io/projected/813a2f60-88c4-4200-a499-4d307772c734-kube-api-access-kkg2s\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.494045 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.493962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-netns\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.494045 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.493989 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-multus-certs\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494062 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-tmp-dir\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494101 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlq5w\" (UniqueName: \"kubernetes.io/projected/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-kube-api-access-jlq5w\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-daemon-config\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494172 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-kubelet\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494196 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-etc-tuned\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494219 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-var-lib-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494244 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494267 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-lib-modules\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494293 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-bin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494314 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-netd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494339 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsd6t\" (UniqueName: \"kubernetes.io/projected/a1918ea1-23d1-4627-af99-2e000c93ecfd-kube-api-access-dsd6t\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494388 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-host-slash\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494416 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-hostroot\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494456 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-log-socket\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494492 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-cnibin\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494522 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-conf\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.494559 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494548 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-conf-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494593 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-ovn\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494618 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-config\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494641 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-hosts-file\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494666 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494692 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494715 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-run\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 
22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494739 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-host\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494764 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-etc-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494788 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494813 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-cnibin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494844 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-slash\") pod \"ovnkube-node-sm54g\" (UID: 
\"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494870 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-systemd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494914 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-iptables-alerter-script\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494941 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-system-cni-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.494989 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-systemd\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495013 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-var-lib-kubelet\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495037 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-node-log\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495060 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495085 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-k8s-cni-cncf-io\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495108 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovn-node-metrics-cert\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495131 
2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-system-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495155 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-os-release\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495184 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-os-release\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495207 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-sys\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495233 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 
22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495256 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495283 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr85w\" (UniqueName: \"kubernetes.io/projected/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-kube-api-access-vr85w\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495344 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6r2\" (UniqueName: \"kubernetes.io/projected/23b22676-e2ab-4cd8-97f3-c119a27160e7-kube-api-access-nq6r2\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495381 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-tmp\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495409 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-cni-binary-copy\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495436 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-kubelet\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.495730 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495460 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-konnectivity-ca\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495485 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-netns\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495513 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-bin\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495543 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysconfig\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495596 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-socket-dir-parent\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495637 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-agent-certs\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495668 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-env-overrides\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495709 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-kubernetes\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 
22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-multus\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495791 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-etc-kubernetes\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495819 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-systemd-units\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495843 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-script-lib\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc8fw\" (UniqueName: \"kubernetes.io/projected/22314c26-39e2-493f-a8fc-95107d7fe18b-kube-api-access-wc8fw\") pod \"multus-pmbqp\" 
(UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495901 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.496312 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.495969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:29:39.519841 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.519803 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:38 +0000 UTC" deadline="2027-11-07 13:15:26.776366603 +0000 UTC" Apr 24 22:29:39.519841 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.519835 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13478h45m47.256534931s" Apr 24 22:29:39.572140 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.572105 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 22:29:39.574969 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.574911 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal" 
event={"ID":"90b166908a0de8e6af429a39155ebb21","Type":"ContainerStarted","Data":"585ec34af61dc46400591668d1225e9e9658f90fdf2e2292d4360c89bcec9e13"} Apr 24 22:29:39.575975 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.575951 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal" event={"ID":"9cfb93022e523476bc812de84da59f57","Type":"ContainerStarted","Data":"0852f4e4f80497b90d17d1049bdd9217e4f5e9b4019c24efb6a838a89d530ab3"} Apr 24 22:29:39.585156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.585134 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 22:29:39.596243 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596220 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-daemon-config\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.596353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-kubelet\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596265 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-etc-tuned\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.596353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596288 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-var-lib-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596332 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-var-lib-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596375 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-kubelet\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-lib-modules\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: 
I0424 22:29:39.596407 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596408 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-bin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596446 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-bin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-netd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596488 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsd6t\" (UniqueName: \"kubernetes.io/projected/a1918ea1-23d1-4627-af99-2e000c93ecfd-kube-api-access-dsd6t\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:29:39.596542 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-host-slash\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596565 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-lib-modules\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596588 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-netd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.596605 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596569 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596650 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-host-slash\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-hostroot\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596726 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-log-socket\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596770 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-log-socket\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596786 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-hostroot\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 
22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596795 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-cnibin\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596823 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-conf\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596847 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-conf-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596849 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-cnibin\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596888 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-conf-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 
22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596905 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-daemon-config\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596919 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-ovn\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-config\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596956 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-conf\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596969 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-ovn\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597116 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:29:39.596973 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-hosts-file\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.597116 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-hosts-file\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597042 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597047 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597087 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-run\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597110 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-host\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-etc-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597159 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597162 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-run\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597172 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-host\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.596807 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597190 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597260 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-etc-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f1645d7-db01-4339-bbd8-b67eb1828971-serviceca\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-cnibin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597341 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysctl-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-slash\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-systemd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.597924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597411 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-cnibin\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597431 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-iptables-alerter-script\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-systemd\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597459 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-system-cni-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597494 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-systemd\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597504 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-config\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597519 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-var-lib-kubelet\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597567 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-var-lib-kubelet\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597571 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-system-cni-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-slash\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-systemd\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-node-log\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-node-log\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-device-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597713 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597740 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-k8s-cni-cncf-io\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.598800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597765 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovn-node-metrics-cert\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597787 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597795 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-system-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597739 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23b22676-e2ab-4cd8-97f3-c119a27160e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-os-release\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-os-release\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597886 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-system-cni-dir\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597872 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-socket-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597919 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjq6c\" (UniqueName: \"kubernetes.io/projected/2f1645d7-db01-4339-bbd8-b67eb1828971-kube-api-access-bjq6c\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-sys\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597970 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597998 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598024 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vr85w\" (UniqueName: \"kubernetes.io/projected/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-kube-api-access-vr85w\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598068 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6r2\" (UniqueName: \"kubernetes.io/projected/23b22676-e2ab-4cd8-97f3-c119a27160e7-kube-api-access-nq6r2\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598094 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-tmp\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598115 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-cni-binary-copy\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598139 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-kubelet\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.599673 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598167 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-konnectivity-ca\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-netns\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598238 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-bin\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598292 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6l4\" (UniqueName: \"kubernetes.io/projected/ffaace54-8d28-433c-b3bf-e5664064b07e-kube-api-access-4x6l4\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598322 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysconfig\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-socket-dir-parent\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-agent-certs\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-env-overrides\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598423 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqkm\" (UniqueName: \"kubernetes.io/projected/1539f28a-ecb8-4e87-801a-55d98a284306-kube-api-access-bgqkm\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598428 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-run-openvswitch\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598434 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598447 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-kubernetes\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598502 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-kubernetes\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598548 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-iptables-alerter-script\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " pod="openshift-network-operator/iptables-alerter-gkgwr"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598610 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-run-netns\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598629 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-os-release\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598654 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-host-cni-bin\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.600421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-multus\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-multus-socket-dir-parent\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598717 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-etc-kubernetes\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598296 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-kubelet\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598746 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-systemd-units\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598772 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-sysconfig\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598795 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-script-lib\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598827 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-etc-kubernetes\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-sys\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598916 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-var-lib-cni-multus\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.598960 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1918ea1-23d1-4627-af99-2e000c93ecfd-systemd-units\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.597845 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-k8s-cni-cncf-io\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599037 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/22314c26-39e2-493f-a8fc-95107d7fe18b-cni-binary-copy\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599041 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-os-release\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599099 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-konnectivity-ca\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599179 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-env-overrides\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599299 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc8fw\" (UniqueName: \"kubernetes.io/projected/22314c26-39e2-493f-a8fc-95107d7fe18b-kube-api-access-wc8fw\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599330 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.601147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599592 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599604 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovnkube-script-lib\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599625 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-registration-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599754 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23b22676-e2ab-4cd8-97f3-c119a27160e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599929 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.599982 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-modprobe-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/813a2f60-88c4-4200-a499-4d307772c734-etc-modprobe-d\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600165 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkg2s\" (UniqueName: \"kubernetes.io/projected/813a2f60-88c4-4200-a499-4d307772c734-kube-api-access-kkg2s\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-netns\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600262 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-multus-certs\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600287 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-tmp-dir\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600326 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlq5w\" (UniqueName: \"kubernetes.io/projected/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-kube-api-access-jlq5w\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600353 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-sys-fs\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600377 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1645d7-db01-4339-bbd8-b67eb1828971-host\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600708 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-tmp-dir\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-netns\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.600791 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/22314c26-39e2-493f-a8fc-95107d7fe18b-host-run-multus-certs\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp"
Apr 24 22:29:39.601784 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.601350 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-tmp\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.602340 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.601417 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/813a2f60-88c4-4200-a499-4d307772c734-etc-tuned\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp"
Apr 24 22:29:39.602340 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.601547 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1918ea1-23d1-4627-af99-2e000c93ecfd-ovn-node-metrics-cert\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:29:39.602531 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.602513 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9179cf17-8c50-479c-8b4c-bb3b4144c5ec-agent-certs\") pod \"konnectivity-agent-tk5jk\" (UID: \"9179cf17-8c50-479c-8b4c-bb3b4144c5ec\") " pod="kube-system/konnectivity-agent-tk5jk"
Apr 24 22:29:39.612139 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.612119 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:39.612139 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.612136 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:39.612283 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.612146 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:39.612283 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.612211 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:40.112185805 +0000 UTC m=+3.127870199 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:39.616809 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.616790 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc8fw\" (UniqueName: \"kubernetes.io/projected/22314c26-39e2-493f-a8fc-95107d7fe18b-kube-api-access-wc8fw\") pod \"multus-pmbqp\" (UID: \"22314c26-39e2-493f-a8fc-95107d7fe18b\") " pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.618342 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.618323 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6r2\" (UniqueName: \"kubernetes.io/projected/23b22676-e2ab-4cd8-97f3-c119a27160e7-kube-api-access-nq6r2\") pod \"multus-additional-cni-plugins-swjrh\" (UID: \"23b22676-e2ab-4cd8-97f3-c119a27160e7\") " pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.621270 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.621252 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkg2s\" (UniqueName: \"kubernetes.io/projected/813a2f60-88c4-4200-a499-4d307772c734-kube-api-access-kkg2s\") pod \"tuned-lsvxp\" (UID: \"813a2f60-88c4-4200-a499-4d307772c734\") " pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.624520 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.624498 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr85w\" (UniqueName: \"kubernetes.io/projected/6ec7071c-10cb-4085-847c-cb8ce4b31cb9-kube-api-access-vr85w\") pod \"iptables-alerter-gkgwr\" (UID: \"6ec7071c-10cb-4085-847c-cb8ce4b31cb9\") " 
pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.624700 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.624683 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsd6t\" (UniqueName: \"kubernetes.io/projected/a1918ea1-23d1-4627-af99-2e000c93ecfd-kube-api-access-dsd6t\") pod \"ovnkube-node-sm54g\" (UID: \"a1918ea1-23d1-4627-af99-2e000c93ecfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.627731 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.627713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlq5w\" (UniqueName: \"kubernetes.io/projected/eed2cad7-a1f7-4e91-826b-b6ca9587c1c9-kube-api-access-jlq5w\") pod \"node-resolver-ztc7p\" (UID: \"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9\") " pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.701298 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701263 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701298 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701301 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2f1645d7-db01-4339-bbd8-b67eb1828971-serviceca\") pod 
\"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.701465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701374 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701393 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701394 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-device-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701465 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-socket-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701472 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjq6c\" (UniqueName: 
\"kubernetes.io/projected/2f1645d7-db01-4339-bbd8-b67eb1828971-kube-api-access-bjq6c\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-device-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701504 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6l4\" (UniqueName: \"kubernetes.io/projected/ffaace54-8d28-433c-b3bf-e5664064b07e-kube-api-access-4x6l4\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701532 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqkm\" (UniqueName: \"kubernetes.io/projected/1539f28a-ecb8-4e87-801a-55d98a284306-kube-api-access-bgqkm\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701561 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-socket-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701627 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-registration-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701698 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-sys-fs\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.701727 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:39.701749 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701736 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1645d7-db01-4339-bbd8-b67eb1828971-host\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.702066 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701761 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2f1645d7-db01-4339-bbd8-b67eb1828971-serviceca\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.702066 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701765 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-registration-dir\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.702066 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701789 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1539f28a-ecb8-4e87-801a-55d98a284306-sys-fs\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.702066 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:39.701797 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:40.201779431 +0000 UTC m=+3.217463829 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:39.702066 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.701808 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1645d7-db01-4339-bbd8-b67eb1828971-host\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.713259 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.713196 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6l4\" (UniqueName: \"kubernetes.io/projected/ffaace54-8d28-433c-b3bf-e5664064b07e-kube-api-access-4x6l4\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:39.713935 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.713919 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjq6c\" (UniqueName: \"kubernetes.io/projected/2f1645d7-db01-4339-bbd8-b67eb1828971-kube-api-access-bjq6c\") pod \"node-ca-c2mrs\" (UID: \"2f1645d7-db01-4339-bbd8-b67eb1828971\") " pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:39.714017 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.713946 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgqkm\" (UniqueName: \"kubernetes.io/projected/1539f28a-ecb8-4e87-801a-55d98a284306-kube-api-access-bgqkm\") pod \"aws-ebs-csi-driver-node-m8sfx\" (UID: \"1539f28a-ecb8-4e87-801a-55d98a284306\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.782865 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.782836 
2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gkgwr" Apr 24 22:29:39.790667 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.790645 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ztc7p" Apr 24 22:29:39.799090 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.799069 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" Apr 24 22:29:39.805831 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.805810 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-swjrh" Apr 24 22:29:39.812449 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.812428 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pmbqp" Apr 24 22:29:39.819981 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.819956 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:39.827523 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.827498 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:29:39.837081 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.837064 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" Apr 24 22:29:39.842569 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:39.842550 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-c2mrs" Apr 24 22:29:40.204754 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.204725 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.204761 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204867 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204876 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204892 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204901 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:40.204924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204928 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:41.204910052 +0000 UTC m=+4.220594428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 22:29:40.205257 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:40.204947 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:41.204936447 +0000 UTC m=+4.220620825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 22:29:40.257704 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.257671 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1645d7_db01_4339_bbd8_b67eb1828971.slice/crio-6e0f8237d0217055c448a3c44acaa5c7f71780c7b75d0be4ebce29d9cdeadfda WatchSource:0}: Error finding container 6e0f8237d0217055c448a3c44acaa5c7f71780c7b75d0be4ebce29d9cdeadfda: Status 404 returned error can't find the container with id 6e0f8237d0217055c448a3c44acaa5c7f71780c7b75d0be4ebce29d9cdeadfda Apr 24 22:29:40.258366 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.258339 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22314c26_39e2_493f_a8fc_95107d7fe18b.slice/crio-39748be1250f2b077574989d77a339588a77afaaadf9dea44f4a4b7bbb2a1e50 WatchSource:0}: Error finding container 39748be1250f2b077574989d77a339588a77afaaadf9dea44f4a4b7bbb2a1e50: Status 404 returned error can't find the container with id 39748be1250f2b077574989d77a339588a77afaaadf9dea44f4a4b7bbb2a1e50 Apr 24 22:29:40.259716 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.259693 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod813a2f60_88c4_4200_a499_4d307772c734.slice/crio-1a1973951038188f0e513b4511860cc255d39457df22803abe87463d54709f8a WatchSource:0}: Error finding container 1a1973951038188f0e513b4511860cc255d39457df22803abe87463d54709f8a: Status 404 returned error can't find the 
container with id 1a1973951038188f0e513b4511860cc255d39457df22803abe87463d54709f8a Apr 24 22:29:40.260491 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.260473 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1539f28a_ecb8_4e87_801a_55d98a284306.slice/crio-0e6f5b51deaae156f855cd9a9f01409bf0ffa1a8a35f2e1e76cf29f5eea55836 WatchSource:0}: Error finding container 0e6f5b51deaae156f855cd9a9f01409bf0ffa1a8a35f2e1e76cf29f5eea55836: Status 404 returned error can't find the container with id 0e6f5b51deaae156f855cd9a9f01409bf0ffa1a8a35f2e1e76cf29f5eea55836 Apr 24 22:29:40.263225 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.263203 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1918ea1_23d1_4627_af99_2e000c93ecfd.slice/crio-01d701868e086fe059a4938eeaa9f22abab3fe5ad9d8107ee50bab67aca92617 WatchSource:0}: Error finding container 01d701868e086fe059a4938eeaa9f22abab3fe5ad9d8107ee50bab67aca92617: Status 404 returned error can't find the container with id 01d701868e086fe059a4938eeaa9f22abab3fe5ad9d8107ee50bab67aca92617 Apr 24 22:29:40.264379 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.264203 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9179cf17_8c50_479c_8b4c_bb3b4144c5ec.slice/crio-165eea9bee5b93c373c13944c7cb0b65aa406f222498629b28c02056ebdc9e46 WatchSource:0}: Error finding container 165eea9bee5b93c373c13944c7cb0b65aa406f222498629b28c02056ebdc9e46: Status 404 returned error can't find the container with id 165eea9bee5b93c373c13944c7cb0b65aa406f222498629b28c02056ebdc9e46 Apr 24 22:29:40.264990 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.264912 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ec7071c_10cb_4085_847c_cb8ce4b31cb9.slice/crio-7c6fe9dcba39c697af0399f73d1e880ecfedf68bb5b268572b684acd130db633 WatchSource:0}: Error finding container 7c6fe9dcba39c697af0399f73d1e880ecfedf68bb5b268572b684acd130db633: Status 404 returned error can't find the container with id 7c6fe9dcba39c697af0399f73d1e880ecfedf68bb5b268572b684acd130db633 Apr 24 22:29:40.266290 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.266264 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed2cad7_a1f7_4e91_826b_b6ca9587c1c9.slice/crio-95b0962af41d4f73dadaea33a466064045c02f77bc2e95d2a426e031c721e25e WatchSource:0}: Error finding container 95b0962af41d4f73dadaea33a466064045c02f77bc2e95d2a426e031c721e25e: Status 404 returned error can't find the container with id 95b0962af41d4f73dadaea33a466064045c02f77bc2e95d2a426e031c721e25e Apr 24 22:29:40.266933 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:29:40.266910 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b22676_e2ab_4cd8_97f3_c119a27160e7.slice/crio-91da33a3af10afc073028960943bff074c0258be8cc6367ecf822b754a54f02c WatchSource:0}: Error finding container 91da33a3af10afc073028960943bff074c0258be8cc6367ecf822b754a54f02c: Status 404 returned error can't find the container with id 91da33a3af10afc073028960943bff074c0258be8cc6367ecf822b754a54f02c Apr 24 22:29:40.520510 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.520469 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 22:24:38 +0000 UTC" deadline="2027-10-25 19:21:25.912501734 +0000 UTC" Apr 24 22:29:40.520510 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.520506 2565 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13172h51m45.391999255s" Apr 24 22:29:40.579081 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.579034 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c2mrs" event={"ID":"2f1645d7-db01-4339-bbd8-b67eb1828971","Type":"ContainerStarted","Data":"6e0f8237d0217055c448a3c44acaa5c7f71780c7b75d0be4ebce29d9cdeadfda"} Apr 24 22:29:40.580106 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.580079 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerStarted","Data":"91da33a3af10afc073028960943bff074c0258be8cc6367ecf822b754a54f02c"} Apr 24 22:29:40.581282 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.581250 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tk5jk" event={"ID":"9179cf17-8c50-479c-8b4c-bb3b4144c5ec","Type":"ContainerStarted","Data":"165eea9bee5b93c373c13944c7cb0b65aa406f222498629b28c02056ebdc9e46"} Apr 24 22:29:40.582299 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.582265 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gkgwr" event={"ID":"6ec7071c-10cb-4085-847c-cb8ce4b31cb9","Type":"ContainerStarted","Data":"7c6fe9dcba39c697af0399f73d1e880ecfedf68bb5b268572b684acd130db633"} Apr 24 22:29:40.583365 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.583330 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmbqp" event={"ID":"22314c26-39e2-493f-a8fc-95107d7fe18b","Type":"ContainerStarted","Data":"39748be1250f2b077574989d77a339588a77afaaadf9dea44f4a4b7bbb2a1e50"} Apr 24 22:29:40.584367 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.584341 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" 
event={"ID":"813a2f60-88c4-4200-a499-4d307772c734","Type":"ContainerStarted","Data":"1a1973951038188f0e513b4511860cc255d39457df22803abe87463d54709f8a"}
Apr 24 22:29:40.586176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.586152 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal" event={"ID":"90b166908a0de8e6af429a39155ebb21","Type":"ContainerStarted","Data":"74e6a70ca5f0b2237d5ff4350c6963237684b81de7d5b37464910fc52f430839"}
Apr 24 22:29:40.587378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.587352 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztc7p" event={"ID":"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9","Type":"ContainerStarted","Data":"95b0962af41d4f73dadaea33a466064045c02f77bc2e95d2a426e031c721e25e"}
Apr 24 22:29:40.588476 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.588454 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"01d701868e086fe059a4938eeaa9f22abab3fe5ad9d8107ee50bab67aca92617"}
Apr 24 22:29:40.590517 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.590486 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" event={"ID":"1539f28a-ecb8-4e87-801a-55d98a284306","Type":"ContainerStarted","Data":"0e6f5b51deaae156f855cd9a9f01409bf0ffa1a8a35f2e1e76cf29f5eea55836"}
Apr 24 22:29:40.599558 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:40.599320 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-202.ec2.internal" podStartSLOduration=1.599304612 podStartE2EDuration="1.599304612s" podCreationTimestamp="2026-04-24 22:29:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:40.59865106 +0000 UTC m=+3.614335472" watchObservedRunningTime="2026-04-24 22:29:40.599304612 +0000 UTC m=+3.614989010"
Apr 24 22:29:41.213688 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.213413 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:41.213868 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.213706 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:41.213868 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.213855 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:41.213983 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.213916 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:43.213898623 +0000 UTC m=+6.229583002 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:41.214337 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.214317 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:41.214337 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.214340 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:41.214494 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.214353 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:41.214494 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.214395 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:43.214380795 +0000 UTC m=+6.230065176 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:41.572847 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.572734 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:41.572847 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.572780 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:41.573331 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.572874 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:41.573331 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:41.572959 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:41.596587 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.596541 2565 generic.go:358] "Generic (PLEG): container finished" podID="9cfb93022e523476bc812de84da59f57" containerID="f58e72082cc34866147ce2ec4ecc724344c7f78ea18b41ab6d34614e4a42c90f" exitCode=0
Apr 24 22:29:41.597440 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:41.597413 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal" event={"ID":"9cfb93022e523476bc812de84da59f57","Type":"ContainerDied","Data":"f58e72082cc34866147ce2ec4ecc724344c7f78ea18b41ab6d34614e4a42c90f"}
Apr 24 22:29:42.620137 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:42.619643 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal" event={"ID":"9cfb93022e523476bc812de84da59f57","Type":"ContainerStarted","Data":"098dc28aeecbbf2c12c24c93c4d0f5a38862c4a5032d82b074f39d3a56818428"}
Apr 24 22:29:43.231098 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:43.231055 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:43.231271 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:43.231109 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:43.231271 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231226 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:43.231271 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231247 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:43.231398 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231274 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:43.231398 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231289 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:43.231398 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231297 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:47.231275445 +0000 UTC m=+10.246959844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:43.231398 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.231338 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:47.231322271 +0000 UTC m=+10.247006661 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:43.571165 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:43.571082 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:43.571329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.571221 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:43.571610 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:43.571592 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:43.571707 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:43.571683 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:45.572013 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:45.571977 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:45.572471 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:45.572101 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:45.572637 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:45.572613 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:45.572760 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:45.572720 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:47.267237 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:47.267189 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:47.267237 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:47.267246 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:47.267837 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267422 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:47.267837 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267483 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:29:55.267465173 +0000 UTC m=+18.283149558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:47.268021 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267896 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:47.268021 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267914 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:47.268021 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267928 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:47.268021 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.267987 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:29:55.267958888 +0000 UTC m=+18.283643282 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:47.573070 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:47.572608 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:47.573070 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.572711 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:47.573371 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:47.572567 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:47.573984 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:47.573924 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:49.571943 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:49.571913 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:49.572483 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:49.571913 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:49.572483 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:49.572031 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:49.572483 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:49.572121 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:51.072976 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.072916 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-202.ec2.internal" podStartSLOduration=13.072898841 podStartE2EDuration="13.072898841s" podCreationTimestamp="2026-04-24 22:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:29:42.641192528 +0000 UTC m=+5.656876929" watchObservedRunningTime="2026-04-24 22:29:51.072898841 +0000 UTC m=+14.088583242"
Apr 24 22:29:51.073394 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.073104 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-h2lq6"]
Apr 24 22:29:51.075790 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.075771 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.075911 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.075830 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:29:51.197427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.197392 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-kubelet-config\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.197629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.197470 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.197629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.197504 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-dbus\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298754 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.298721 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298754 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.298766 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-dbus\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298983 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.298869 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-kubelet-config\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298983 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.298904 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:51.298983 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.298949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-dbus\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298983 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.298950 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1034778-383a-47c7-b317-b6284cb34a98-kubelet-config\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.298983 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.298984 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:51.798963851 +0000 UTC m=+14.814648249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:51.572088 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.572048 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:51.572274 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.572186 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:51.572274 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.572257 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:51.572388 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.572374 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:51.802220 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:51.802190 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:51.802455 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.802317 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:51.802455 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:51.802381 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:52.8023644 +0000 UTC m=+15.818048791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:52.571075 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:52.571040 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:52.571625 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:52.571166 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:29:52.811655 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:52.811623 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:52.811817 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:52.811770 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:52.811883 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:52.811832 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:54.811813708 +0000 UTC m=+17.827498090 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:53.571364 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:53.571327 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:53.572023 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:53.571465 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:53.572023 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:53.571502 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:53.572023 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:53.571597 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:54.572074 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:54.572040 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:54.572503 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:54.572168 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:29:54.823046 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:54.822963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:54.823207 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:54.823118 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:54.823207 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:54.823180 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:29:58.823165099 +0000 UTC m=+21.838849481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:29:55.326251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:55.326212 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:55.326449 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:55.326262 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:55.326449 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326389 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:55.326449 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326411 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 22:29:55.326449 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326434 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 22:29:55.326449 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326446 2565 projected.go:194] Error preparing data for projected volume kube-api-access-2wl9k for pod openshift-network-diagnostics/network-check-target-lgf7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:55.326665 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326461 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.326442461 +0000 UTC m=+34.342126839 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 22:29:55.326665 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.326498 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k podName:01f08b4f-503c-494e-836c-e58cbfde457a nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.32648203 +0000 UTC m=+34.342166427 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2wl9k" (UniqueName: "kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k") pod "network-check-target-lgf7j" (UID: "01f08b4f-503c-494e-836c-e58cbfde457a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 22:29:55.571800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:55.571737 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:29:55.571985 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.571879 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:29:55.571985 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:55.571940 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:29:55.572091 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:55.572064 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:29:56.571234 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:56.571203 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:29:56.571420 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:56.571332 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98" Apr 24 22:29:57.575329 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.575073 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:29:57.579568 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.579535 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:57.582212 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:57.579802 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e" Apr 24 22:29:57.582212 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:57.579984 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a" Apr 24 22:29:57.647545 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.647515 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ztc7p" event={"ID":"eed2cad7-a1f7-4e91-826b-b6ca9587c1c9","Type":"ContainerStarted","Data":"cec18b6e066a6b6de554f9d7ffcd270a7ad03be30af35b3e1da947faf7cd1d8c"} Apr 24 22:29:57.648802 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.648783 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"98d3bcabf69b464d7bbd5319b7dbbb855784028f694747587de6e581038c3727"} Apr 24 22:29:57.649866 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.649846 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c2mrs" event={"ID":"2f1645d7-db01-4339-bbd8-b67eb1828971","Type":"ContainerStarted","Data":"0a77cb92034fb262ff1924deff967cbda17fd6f97b8ff6b240cbfcb885949858"} Apr 24 22:29:57.651198 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.651178 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tk5jk" event={"ID":"9179cf17-8c50-479c-8b4c-bb3b4144c5ec","Type":"ContainerStarted","Data":"8e460d107e33bee5d8697e29c8af52e5afc35023ab75fd4596d968466307fd6a"} Apr 24 22:29:57.652296 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.652278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pmbqp" event={"ID":"22314c26-39e2-493f-a8fc-95107d7fe18b","Type":"ContainerStarted","Data":"f071460170d1279ff742236f6f96e37db6172a89231527d7e80f6faf2fd0f1cd"} Apr 24 22:29:57.653427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.653409 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" 
event={"ID":"813a2f60-88c4-4200-a499-4d307772c734","Type":"ContainerStarted","Data":"1cd2e2940b37ec90a3339002ebd8c0925b776405f96e442fa2e6e95e71ae0b7e"} Apr 24 22:29:57.663326 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.663277 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ztc7p" podStartSLOduration=3.6555120150000002 podStartE2EDuration="20.663267241s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.268338008 +0000 UTC m=+3.284022385" lastFinishedPulling="2026-04-24 22:29:57.276093233 +0000 UTC m=+20.291777611" observedRunningTime="2026-04-24 22:29:57.66322696 +0000 UTC m=+20.678911358" watchObservedRunningTime="2026-04-24 22:29:57.663267241 +0000 UTC m=+20.678951639" Apr 24 22:29:57.701311 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.701266 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lsvxp" podStartSLOduration=3.656650928 podStartE2EDuration="20.701248533s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.261532085 +0000 UTC m=+3.277216469" lastFinishedPulling="2026-04-24 22:29:57.306129685 +0000 UTC m=+20.321814074" observedRunningTime="2026-04-24 22:29:57.682254647 +0000 UTC m=+20.697939050" watchObservedRunningTime="2026-04-24 22:29:57.701248533 +0000 UTC m=+20.716932932" Apr 24 22:29:57.730058 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.730013 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-c2mrs" podStartSLOduration=8.404232687 podStartE2EDuration="20.729994035s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.259342835 +0000 UTC m=+3.275027213" lastFinishedPulling="2026-04-24 22:29:52.585104167 +0000 UTC m=+15.600788561" observedRunningTime="2026-04-24 22:29:57.701231447 +0000 UTC m=+20.716915847" 
watchObservedRunningTime="2026-04-24 22:29:57.729994035 +0000 UTC m=+20.745678434" Apr 24 22:29:57.730427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:57.730398 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pmbqp" podStartSLOduration=3.648780033 podStartE2EDuration="20.730389011s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.261187604 +0000 UTC m=+3.276871999" lastFinishedPulling="2026-04-24 22:29:57.3427966 +0000 UTC m=+20.358480977" observedRunningTime="2026-04-24 22:29:57.729848642 +0000 UTC m=+20.745533040" watchObservedRunningTime="2026-04-24 22:29:57.730389011 +0000 UTC m=+20.746073410" Apr 24 22:29:58.571883 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.571717 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:29:58.571992 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:58.571961 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98" Apr 24 22:29:58.657300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657273 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:29:58.657792 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657598 2565 generic.go:358] "Generic (PLEG): container finished" podID="a1918ea1-23d1-4627-af99-2e000c93ecfd" containerID="8d5d7389e9cf7855ab34fcea1fa5529cd179f506ce391ef9ddb73e218f2e2123" exitCode=1 Apr 24 22:29:58.657792 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657605 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"95692a06e08b62c80b9c40f71f2681650ca2745e0a0b8221b1da859575861e09"} Apr 24 22:29:58.657792 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657636 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"6644a436a7b034967062dd6ac1e3e20af24e55f91c6c0b499f1b595a69d74a7e"} Apr 24 22:29:58.657792 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657645 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"10468236bcc66997fa2441888c89e0701fa034f24af25b56a6462156b235162a"} Apr 24 22:29:58.657792 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657656 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"04f867ea447e3f4eab431412b8143606d713a6fbe10e3ad42290f8bd3bcfe97c"} Apr 24 22:29:58.657792 
ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.657668 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerDied","Data":"8d5d7389e9cf7855ab34fcea1fa5529cd179f506ce391ef9ddb73e218f2e2123"} Apr 24 22:29:58.658848 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.658825 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" event={"ID":"1539f28a-ecb8-4e87-801a-55d98a284306","Type":"ContainerStarted","Data":"37de5eeacbd0e6d81c7dd070eea0c2e7163100be8bce1faabcc90675b9f5fcaf"} Apr 24 22:29:58.660032 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.660012 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="c8e929987d8618fbd1aae3e22746a4f2d65f8453cef6f8d4821a1d5bc9646968" exitCode=0 Apr 24 22:29:58.660108 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.660068 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"c8e929987d8618fbd1aae3e22746a4f2d65f8453cef6f8d4821a1d5bc9646968"} Apr 24 22:29:58.664511 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.664483 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gkgwr" event={"ID":"6ec7071c-10cb-4085-847c-cb8ce4b31cb9","Type":"ContainerStarted","Data":"c4104e1a58d8a69d00fb57b52804652f7ca08d41fee5ab40dcb88f4e6ee03939"} Apr 24 22:29:58.706570 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.706529 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gkgwr" podStartSLOduration=4.667308327 podStartE2EDuration="21.706516664s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 
22:29:40.266914243 +0000 UTC m=+3.282598625" lastFinishedPulling="2026-04-24 22:29:57.306122579 +0000 UTC m=+20.321806962" observedRunningTime="2026-04-24 22:29:58.705850772 +0000 UTC m=+21.721535171" watchObservedRunningTime="2026-04-24 22:29:58.706516664 +0000 UTC m=+21.722201089" Apr 24 22:29:58.802979 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.802908 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:58.803480 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.803462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:58.821590 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.821530 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tk5jk" podStartSLOduration=4.812510954 podStartE2EDuration="21.82151829s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.267145547 +0000 UTC m=+3.282829931" lastFinishedPulling="2026-04-24 22:29:57.276152886 +0000 UTC m=+20.291837267" observedRunningTime="2026-04-24 22:29:58.722022567 +0000 UTC m=+21.737706990" watchObservedRunningTime="2026-04-24 22:29:58.82151829 +0000 UTC m=+21.837202688" Apr 24 22:29:58.855438 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:58.855404 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:29:58.855558 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:58.855541 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:58.855643 ip-10-0-142-202 
kubenswrapper[2565]: E0424 22:29:58.855626 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:06.855612195 +0000 UTC m=+29.871296577 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered Apr 24 22:29:59.093231 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.093193 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 22:29:59.560760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.560667 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T22:29:59.093221916Z","UUID":"21daf82c-b1c4-4ad6-a166-8af6b2e50efa","Handler":null,"Name":"","Endpoint":""} Apr 24 22:29:59.564108 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.564079 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 22:29:59.564235 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.564115 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 22:29:59.571502 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.571475 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:29:59.571659 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:59.571604 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a" Apr 24 22:29:59.572005 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.571988 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:29:59.572095 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:29:59.572076 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e" Apr 24 22:29:59.669300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.669237 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" event={"ID":"1539f28a-ecb8-4e87-801a-55d98a284306","Type":"ContainerStarted","Data":"0382a499a329e98a50d1e6fd1033e5435f2d8aba0b15941e3aab008bd66e9eb1"} Apr 24 22:29:59.670259 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.669561 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:29:59.670259 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:29:59.670195 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tk5jk" Apr 24 22:30:00.571709 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:00.571685 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:00.571821 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:00.571791 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98" Apr 24 22:30:00.674032 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:00.673961 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:30:00.674562 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:00.674323 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"c3da1d7ea60ab8ebfdfbfbfafce6140704df0defdb58c37508721cf207a47a1a"} Apr 24 22:30:00.676411 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:00.676384 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" event={"ID":"1539f28a-ecb8-4e87-801a-55d98a284306","Type":"ContainerStarted","Data":"bd62d425fa005e70b7b3f6c11145bb4de345e3ae213f51f759b0080da38f7a3f"} Apr 24 22:30:00.695528 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:00.695484 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8sfx" podStartSLOduration=3.553569688 podStartE2EDuration="23.695471167s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.26298246 +0000 UTC m=+3.278666850" lastFinishedPulling="2026-04-24 22:30:00.404883936 +0000 UTC m=+23.420568329" observedRunningTime="2026-04-24 22:30:00.694633603 +0000 UTC m=+23.710317999" watchObservedRunningTime="2026-04-24 22:30:00.695471167 +0000 UTC m=+23.711155569" Apr 24 22:30:01.571662 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:01.571631 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:30:01.571662 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:01.571666 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:01.571893 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:01.571746 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e" Apr 24 22:30:01.571943 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:01.571885 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a" Apr 24 22:30:02.571478 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:02.571290 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:02.571882 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:02.571566 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98" Apr 24 22:30:03.571101 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.571074 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:03.571298 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:03.571179 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a" Apr 24 22:30:03.571298 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.571281 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:30:03.571410 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:03.571388 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e" Apr 24 22:30:03.684311 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.684275 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="846f470c5a0a501669da6290166b531e8b4bd6483e960f94e093cc2036f165e2" exitCode=0 Apr 24 22:30:03.684794 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.684348 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"846f470c5a0a501669da6290166b531e8b4bd6483e960f94e093cc2036f165e2"} Apr 24 22:30:03.687608 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.687592 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:30:03.687919 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.687891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"90486bba03aa07634a46c786eb7bb9e678956a825df98bc76ec943422c7001c1"} Apr 24 22:30:03.688243 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.688223 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:30:03.688331 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.688251 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:30:03.688498 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:03.688484 2565 scope.go:117] "RemoveContainer" containerID="8d5d7389e9cf7855ab34fcea1fa5529cd179f506ce391ef9ddb73e218f2e2123" Apr 24 22:30:03.703124 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:30:03.703104 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" Apr 24 22:30:04.571825 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.571561 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:04.571975 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:04.571857 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98" Apr 24 22:30:04.628942 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.628912 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h2lq6"] Apr 24 22:30:04.633618 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.633562 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bjhd"] Apr 24 22:30:04.633726 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.633687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:30:04.633794 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:04.633776 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:30:04.634431 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.634404 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lgf7j"]
Apr 24 22:30:04.634539 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.634522 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:30:04.634652 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:04.634632 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:30:04.692545 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.692523 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log"
Apr 24 22:30:04.693032 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.692829 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" event={"ID":"a1918ea1-23d1-4627-af99-2e000c93ecfd","Type":"ContainerStarted","Data":"bf15f851f9c37615f373b71112bde7caa79239d844294c1b3f1dbbc36cbbac96"}
Apr 24 22:30:04.693110 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.693037 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:30:04.694637 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.694611 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="56d42047453dd441e40c8a6e42e2c02d1b7faf3234afeadfb58ad362125c99a9" exitCode=0
Apr 24 22:30:04.694760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.694670 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"56d42047453dd441e40c8a6e42e2c02d1b7faf3234afeadfb58ad362125c99a9"}
Apr 24 22:30:04.694760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.694709 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:30:04.694837 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:04.694811 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:30:04.707781 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.707756 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:30:04.720991 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:04.720953 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g" podStartSLOduration=10.613858132 podStartE2EDuration="27.720942063s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.265717701 +0000 UTC m=+3.281402084" lastFinishedPulling="2026-04-24 22:29:57.372801627 +0000 UTC m=+20.388486015" observedRunningTime="2026-04-24 22:30:04.720436038 +0000 UTC m=+27.736120436" watchObservedRunningTime="2026-04-24 22:30:04.720942063 +0000 UTC m=+27.736626461"
Apr 24 22:30:05.698831 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:05.698741 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="21d50205d05a4335bfb49dfce71ceb90534e2cb8d8fe235993234775dc6d4333" exitCode=0
Apr 24 22:30:05.699229 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:05.698833 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"21d50205d05a4335bfb49dfce71ceb90534e2cb8d8fe235993234775dc6d4333"}
Apr 24 22:30:06.571920 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:06.571843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:30:06.572126 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:06.571850 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:30:06.572126 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:06.571979 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:30:06.572126 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:06.571998 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:30:06.572126 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:06.572110 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:30:06.572326 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:06.572172 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:30:06.917346 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:06.917256 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:30:06.917878 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:06.917391 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:06.917878 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:06.917464 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret podName:f1034778-383a-47c7-b317-b6284cb34a98 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:22.917444344 +0000 UTC m=+45.933128736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret") pod "global-pull-secret-syncer-h2lq6" (UID: "f1034778-383a-47c7-b317-b6284cb34a98") : object "kube-system"/"original-pull-secret" not registered
Apr 24 22:30:08.571974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:08.571940 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:30:08.572614 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:08.571940 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:30:08.572614 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:08.572057 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-h2lq6" podUID="f1034778-383a-47c7-b317-b6284cb34a98"
Apr 24 22:30:08.572614 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:08.572067 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:30:08.572614 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:08.572154 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:30:08.572614 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:08.572238 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lgf7j" podUID="01f08b4f-503c-494e-836c-e58cbfde457a"
Apr 24 22:30:10.272891 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.272690 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-202.ec2.internal" event="NodeReady"
Apr 24 22:30:10.273349 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.272995 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 22:30:10.312099 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.311989 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"]
Apr 24 22:30:10.315826 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.315792 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"]
Apr 24 22:30:10.316794 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.315956 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"
Apr 24 22:30:10.319848 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.319824 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"]
Apr 24 22:30:10.319955 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.319935 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:10.320024 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.319965 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 22:30:10.320081 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.320051 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 22:30:10.320169 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.320149 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 22:30:10.320231 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.320202 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 22:30:10.320289 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.320229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-js584\""
Apr 24 22:30:10.322884 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.322832 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"]
Apr 24 22:30:10.323117 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.323102 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:30:10.323710 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.323683 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 22:30:10.323811 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.323784 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 22:30:10.323881 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.323821 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z8wpg\""
Apr 24 22:30:10.325265 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.325124 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 22:30:10.325992 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.325976 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-788d94b556-2c8mw"]
Apr 24 22:30:10.326128 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.326114 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.328812 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.328792 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.329741 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.329720 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 22:30:10.329946 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.329926 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 22:30:10.329946 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.329943 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 22:30:10.330042 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.330006 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 22:30:10.330653 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.330624 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"]
Apr 24 22:30:10.332424 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.332338 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 22:30:10.332807 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.332664 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9rsh7\""
Apr 24 22:30:10.332919 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.332808 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 22:30:10.333159 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.333138 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 22:30:10.335810 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.335791 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"]
Apr 24 22:30:10.337682 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.337664 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"]
Apr 24 22:30:10.338493 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.338414 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"]
Apr 24 22:30:10.340695 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.340675 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 22:30:10.351892 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.351873 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-788d94b556-2c8mw"]
Apr 24 22:30:10.352661 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.352639 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t4vp4"]
Apr 24 22:30:10.355913 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.355892 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.357961 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.357912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\""
Apr 24 22:30:10.357961 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.357912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 22:30:10.358128 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.358087 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 22:30:10.370704 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.370672 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4vp4"]
Apr 24 22:30:10.441166 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441129 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900086b9-ed8f-407e-9009-80e389ecd712-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"
Apr 24 22:30:10.441166 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441169 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441387 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441195 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-config-volume\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.441387 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441223 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:10.441387 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441247 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/487fa0c0-5ff1-4446-b340-3c31a158bec4-klusterlet-config\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:30:10.441387 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441290 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:10.441387 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441340 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441428 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441457 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rgd9\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441509 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4kj\" (UniqueName: \"kubernetes.io/projected/900086b9-ed8f-407e-9009-80e389ecd712-kube-api-access-5t4kj\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441537 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441563 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441606 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441604 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441636 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441688 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d447531-8a44-4047-a2b0-0d208b808c15-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441705 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qlk\" (UniqueName: \"kubernetes.io/projected/9d447531-8a44-4047-a2b0-0d208b808c15-kube-api-access-f6qlk\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441719 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441744 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkc8\" (UniqueName: \"kubernetes.io/projected/487fa0c0-5ff1-4446-b340-3c31a158bec4-kube-api-access-kbkc8\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441794 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7mf\" (UniqueName: \"kubernetes.io/projected/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-kube-api-access-cp7mf\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441833 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.441911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441876 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.442208 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441931 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-tmp-dir\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.442208 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.441969 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.442208 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.442022 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/487fa0c0-5ff1-4446-b340-3c31a158bec4-tmp\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:30:10.446590 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.446552 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7pxrr"]
Apr 24 22:30:10.449663 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.449643 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:30:10.452375 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.452354 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 22:30:10.452477 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.452426 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 22:30:10.452543 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.452482 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\""
Apr 24 22:30:10.452724 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.452710 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 22:30:10.466409 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.466378 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7pxrr"]
Apr 24 22:30:10.543300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543262 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.543483 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900086b9-ed8f-407e-9009-80e389ecd712-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"
Apr 24 22:30:10.543483 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.543483 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-config-volume\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.543687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:10.543687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543590 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.543687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543618 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rgd9\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.543687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:30:10.543687 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543685 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.543700 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543711 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543740 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d447531-8a44-4047-a2b0-0d208b808c15-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.543770 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.043749339 +0000 UTC m=+34.059433716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543802 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543832 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qlk\" (UniqueName: \"kubernetes.io/projected/9d447531-8a44-4047-a2b0-0d208b808c15-kube-api-access-f6qlk\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543871 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkc8\" (UniqueName: \"kubernetes.io/projected/487fa0c0-5ff1-4446-b340-3c31a158bec4-kube-api-access-kbkc8\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID:
\"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543928 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7mf\" (UniqueName: \"kubernetes.io/projected/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-kube-api-access-cp7mf\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:10.543959 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543957 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/487fa0c0-5ff1-4446-b340-3c31a158bec4-tmp\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.543988 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544018 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/487fa0c0-5ff1-4446-b340-3c31a158bec4-klusterlet-config\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544077 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-config-volume\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544108 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnqzb\" (UniqueName: \"kubernetes.io/projected/57eedb74-e256-4612-845f-7dc838139e1f-kube-api-access-vnqzb\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.544361 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.544423 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.044405994 +0000 UTC m=+34.060090375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544427 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9d447531-8a44-4047-a2b0-0d208b808c15-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.544509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544465 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.544967 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544715 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.544967 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544738 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " 
pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.544967 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544784 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/487fa0c0-5ff1-4446-b340-3c31a158bec4-tmp\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.544967 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.544790 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4kj\" (UniqueName: \"kubernetes.io/projected/900086b9-ed8f-407e-9009-80e389ecd712-kube-api-access-5t4kj\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" Apr 24 22:30:10.545163 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545047 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.545163 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.545124 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:10.545163 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.545136 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found Apr 24 22:30:10.545309 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545159 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.545309 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.545176 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.045162096 +0000 UTC m=+34.060846485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found Apr 24 22:30:10.545309 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545217 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.545309 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545257 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-tmp-dir\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:10.545890 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545539 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:10.545890 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.545658 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-tmp-dir\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:10.547049 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.547026 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.548494 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.548450 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.548494 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.548492 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.549421 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:30:10.549049 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/900086b9-ed8f-407e-9009-80e389ecd712-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" Apr 24 22:30:10.549421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.548730 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.549421 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.549371 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/487fa0c0-5ff1-4446-b340-3c31a158bec4-klusterlet-config\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.552371 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.552310 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.552969 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.552944 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-ca\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.553074 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.552994 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9d447531-8a44-4047-a2b0-0d208b808c15-hub\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.557823 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.557786 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.560271 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.560227 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkc8\" (UniqueName: \"kubernetes.io/projected/487fa0c0-5ff1-4446-b340-3c31a158bec4-kube-api-access-kbkc8\") pod \"klusterlet-addon-workmgr-7456857b6f-5w7wr\" (UID: \"487fa0c0-5ff1-4446-b340-3c31a158bec4\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.561493 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.561451 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rgd9\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " 
pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:10.561813 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.561793 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7mf\" (UniqueName: \"kubernetes.io/projected/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-kube-api-access-cp7mf\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:10.562424 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.562370 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qlk\" (UniqueName: \"kubernetes.io/projected/9d447531-8a44-4047-a2b0-0d208b808c15-kube-api-access-f6qlk\") pod \"cluster-proxy-proxy-agent-8689464f96-zdzfz\" (UID: \"9d447531-8a44-4047-a2b0-0d208b808c15\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:10.563088 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.563067 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4kj\" (UniqueName: \"kubernetes.io/projected/900086b9-ed8f-407e-9009-80e389ecd712-kube-api-access-5t4kj\") pod \"managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj\" (UID: \"900086b9-ed8f-407e-9009-80e389ecd712\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" Apr 24 22:30:10.571974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.571954 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:30:10.571974 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.571969 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:10.572135 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.571990 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:10.574858 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.574664 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 22:30:10.574858 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.574841 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 22:30:10.575010 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.574814 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 22:30:10.575248 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.575229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\"" Apr 24 22:30:10.575367 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.575345 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-cgksw\"" Apr 24 22:30:10.575850 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.575830 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 22:30:10.639402 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.639370 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" Apr 24 22:30:10.646419 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.646396 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnqzb\" (UniqueName: \"kubernetes.io/projected/57eedb74-e256-4612-845f-7dc838139e1f-kube-api-access-vnqzb\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:10.646522 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.646464 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:10.646691 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.646665 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:10.646768 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:10.646756 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:30:11.146737066 +0000 UTC m=+34.162421457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found Apr 24 22:30:10.655375 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.655352 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:10.660737 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.660714 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnqzb\" (UniqueName: \"kubernetes.io/projected/57eedb74-e256-4612-845f-7dc838139e1f-kube-api-access-vnqzb\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:10.663450 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:10.663414 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:30:11.050474 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.050432 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:11.050682 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.050565 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:11.050682 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050616 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:11.050682 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.050640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050697 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.050669861 +0000 UTC m=+35.066354242 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050722 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050743 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050755 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050797 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:12.05078043 +0000 UTC m=+35.066464816 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found Apr 24 22:30:11.050864 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.050819 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:12.050808291 +0000 UTC m=+35.066492681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found Apr 24 22:30:11.151006 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.150972 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:11.151171 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.151125 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:11.151211 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.151186 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:12.151169809 +0000 UTC m=+35.166854190 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found Apr 24 22:30:11.334316 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.334238 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz"] Apr 24 22:30:11.335710 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.335691 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"] Apr 24 22:30:11.336502 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.336484 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"] Apr 24 22:30:11.352900 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.352875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:11.352900 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.352905 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:30:11.353063 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.353016 2565 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 22:30:11.353098 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:11.353065 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:30:43.353052039 +0000 UTC m=+66.368736417 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : secret "metrics-daemon-secret" not found Apr 24 22:30:11.355781 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.355762 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wl9k\" (UniqueName: \"kubernetes.io/projected/01f08b4f-503c-494e-836c-e58cbfde457a-kube-api-access-2wl9k\") pod \"network-check-target-lgf7j\" (UID: \"01f08b4f-503c-494e-836c-e58cbfde457a\") " pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:11.373856 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:30:11.373827 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d447531_8a44_4047_a2b0_0d208b808c15.slice/crio-50c393cba9e4a65e6afede186cf387e26977a11c1400ba7c4cca1f3f15bd131a WatchSource:0}: Error finding container 50c393cba9e4a65e6afede186cf387e26977a11c1400ba7c4cca1f3f15bd131a: Status 404 returned error can't find the container with id 50c393cba9e4a65e6afede186cf387e26977a11c1400ba7c4cca1f3f15bd131a Apr 24 22:30:11.374236 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:30:11.374174 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod900086b9_ed8f_407e_9009_80e389ecd712.slice/crio-dec1d9e4208c8d570746b02684f2c34850939458edc5e87a4344604f3d6d04a1 WatchSource:0}: Error finding container dec1d9e4208c8d570746b02684f2c34850939458edc5e87a4344604f3d6d04a1: Status 404 returned error can't find the container with id dec1d9e4208c8d570746b02684f2c34850939458edc5e87a4344604f3d6d04a1 Apr 24 22:30:11.375296 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:30:11.375271 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487fa0c0_5ff1_4446_b340_3c31a158bec4.slice/crio-cc442e95d7ab18c683bab0bed8f8e84d4d46012c445e6f2768234be8c162ad4b WatchSource:0}: Error finding container cc442e95d7ab18c683bab0bed8f8e84d4d46012c445e6f2768234be8c162ad4b: Status 404 returned error can't find the container with id cc442e95d7ab18c683bab0bed8f8e84d4d46012c445e6f2768234be8c162ad4b Apr 24 22:30:11.497291 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.497265 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:11.640253 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.640049 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lgf7j"] Apr 24 22:30:11.645782 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:30:11.645752 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f08b4f_503c_494e_836c_e58cbfde457a.slice/crio-06d5754f16b57410cfc97b8c1bf1ec0313d3f64fcfa1c670d3ad0ada8185e948 WatchSource:0}: Error finding container 06d5754f16b57410cfc97b8c1bf1ec0313d3f64fcfa1c670d3ad0ada8185e948: Status 404 returned error can't find the container with id 06d5754f16b57410cfc97b8c1bf1ec0313d3f64fcfa1c670d3ad0ada8185e948 Apr 24 22:30:11.718179 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.718127 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerStarted","Data":"f489115f1372f292cd979053a3f9b5500fa5f850f069d80644fbe7fe0ad7afce"} Apr 24 22:30:11.719446 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.719314 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lgf7j" event={"ID":"01f08b4f-503c-494e-836c-e58cbfde457a","Type":"ContainerStarted","Data":"06d5754f16b57410cfc97b8c1bf1ec0313d3f64fcfa1c670d3ad0ada8185e948"} Apr 24 22:30:11.720370 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.720352 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" event={"ID":"487fa0c0-5ff1-4446-b340-3c31a158bec4","Type":"ContainerStarted","Data":"cc442e95d7ab18c683bab0bed8f8e84d4d46012c445e6f2768234be8c162ad4b"} Apr 24 22:30:11.721230 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:30:11.721214 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerStarted","Data":"50c393cba9e4a65e6afede186cf387e26977a11c1400ba7c4cca1f3f15bd131a"} Apr 24 22:30:11.722110 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:11.722093 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" event={"ID":"900086b9-ed8f-407e-9009-80e389ecd712","Type":"ContainerStarted","Data":"dec1d9e4208c8d570746b02684f2c34850939458edc5e87a4344604f3d6d04a1"} Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.059418 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.059507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.059546 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:12.060341 
ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.059740 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.059805 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.059785638 +0000 UTC m=+37.075470023 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.060239 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:12.060341 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.060290 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.06027439 +0000 UTC m=+37.075958771 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found Apr 24 22:30:12.060854 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.060360 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:12.060854 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.060384 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found Apr 24 22:30:12.060854 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.060438 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.060418553 +0000 UTC m=+37.076102942 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found Apr 24 22:30:12.161077 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.161041 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:12.161255 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.161198 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:12.161311 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:12.161269 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:30:14.161248552 +0000 UTC m=+37.176932937 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found Apr 24 22:30:12.735800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.734899 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="f489115f1372f292cd979053a3f9b5500fa5f850f069d80644fbe7fe0ad7afce" exitCode=0 Apr 24 22:30:12.735800 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:12.734961 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"f489115f1372f292cd979053a3f9b5500fa5f850f069d80644fbe7fe0ad7afce"} Apr 24 22:30:13.745726 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:13.745684 2565 generic.go:358] "Generic (PLEG): container finished" podID="23b22676-e2ab-4cd8-97f3-c119a27160e7" containerID="056f87dcf01fe25af7ea06e2de2e2201f3dbf21162deee94e4131b7b3a4e42cd" exitCode=0 Apr 24 22:30:13.746378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:13.745759 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerDied","Data":"056f87dcf01fe25af7ea06e2de2e2201f3dbf21162deee94e4131b7b3a4e42cd"} Apr 24 22:30:14.081771 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:14.081683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:14.081771 
ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:14.081762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:14.081980 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:14.081797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:14.081980 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.081959 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:14.082081 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082022 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.082002703 +0000 UTC m=+41.097687084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found Apr 24 22:30:14.082386 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082251 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:14.082386 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082281 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:14.082386 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082286 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found Apr 24 22:30:14.082386 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082346 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.082322587 +0000 UTC m=+41.098006970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found Apr 24 22:30:14.082386 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.082365 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. 
No retries permitted until 2026-04-24 22:30:18.082354609 +0000 UTC m=+41.098038994 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found Apr 24 22:30:14.183012 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:14.182967 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:14.183190 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.183106 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:14.183190 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:14.183173 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:30:18.183155117 +0000 UTC m=+41.198839506 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found Apr 24 22:30:18.117108 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:18.117019 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:18.117115 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:18.117142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117148 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117233 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert 
podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.117210541 +0000 UTC m=+49.132894933 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117262 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117270 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117282 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117321 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.117307223 +0000 UTC m=+49.132991607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found Apr 24 22:30:18.117545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.117335 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.117329102 +0000 UTC m=+49.133013479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found Apr 24 22:30:18.218222 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:18.218189 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:30:18.218390 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.218361 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 22:30:18.218457 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:18.218446 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:30:26.218425699 +0000 UTC m=+49.234110077 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found Apr 24 22:30:19.760691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.760567 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerStarted","Data":"dd29e571480bff21c84ee0e0eba3c1e18962dc4c7040d656a265b968f15e50d1"} Apr 24 22:30:19.762453 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.762422 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" event={"ID":"900086b9-ed8f-407e-9009-80e389ecd712","Type":"ContainerStarted","Data":"b76a745825aed67ba28cd9546fd160ad94032e19098f5977de8b2669d5bb7bba"} Apr 24 22:30:19.766123 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.766099 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-swjrh" event={"ID":"23b22676-e2ab-4cd8-97f3-c119a27160e7","Type":"ContainerStarted","Data":"62ba69b84b4c1229c76f62716c18ec4c3b2e5b6f540fafaee440388e6d4e26a2"} Apr 24 22:30:19.767763 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.767727 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lgf7j" event={"ID":"01f08b4f-503c-494e-836c-e58cbfde457a","Type":"ContainerStarted","Data":"ef910db56a6f4160e818dec1c36823dd456e872e6e7b1b986e888ab166e33ff3"} Apr 24 22:30:19.767887 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.767851 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lgf7j" Apr 24 22:30:19.769128 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:30:19.769109 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" event={"ID":"487fa0c0-5ff1-4446-b340-3c31a158bec4","Type":"ContainerStarted","Data":"f16364010c9f75334ff206d8d9d3e4a44c9925aa908ed03873be10d972afd326"} Apr 24 22:30:19.769365 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.769328 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:19.771480 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.771462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" Apr 24 22:30:19.790613 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.790553 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" podStartSLOduration=13.024857286 podStartE2EDuration="20.79053049s" podCreationTimestamp="2026-04-24 22:29:59 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.383953004 +0000 UTC m=+34.399637394" lastFinishedPulling="2026-04-24 22:30:19.149626207 +0000 UTC m=+42.165310598" observedRunningTime="2026-04-24 22:30:19.78952472 +0000 UTC m=+42.805209121" watchObservedRunningTime="2026-04-24 22:30:19.79053049 +0000 UTC m=+42.806214899" Apr 24 22:30:19.806839 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.806787 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lgf7j" podStartSLOduration=35.294267741 podStartE2EDuration="42.806773538s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.648022586 +0000 UTC m=+34.663706966" lastFinishedPulling="2026-04-24 22:30:19.160528365 +0000 UTC m=+42.176212763" 
observedRunningTime="2026-04-24 22:30:19.805038866 +0000 UTC m=+42.820723278" watchObservedRunningTime="2026-04-24 22:30:19.806773538 +0000 UTC m=+42.822457915" Apr 24 22:30:19.833238 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.832872 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-swjrh" podStartSLOduration=11.695695976 podStartE2EDuration="42.832855797s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:29:40.269812353 +0000 UTC m=+3.285496738" lastFinishedPulling="2026-04-24 22:30:11.406972179 +0000 UTC m=+34.422656559" observedRunningTime="2026-04-24 22:30:19.831741351 +0000 UTC m=+42.847425752" watchObservedRunningTime="2026-04-24 22:30:19.832855797 +0000 UTC m=+42.848540194" Apr 24 22:30:19.859189 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:19.859087 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" podStartSLOduration=13.093031417 podStartE2EDuration="20.859067892s" podCreationTimestamp="2026-04-24 22:29:59 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.383665386 +0000 UTC m=+34.399349762" lastFinishedPulling="2026-04-24 22:30:19.149701856 +0000 UTC m=+42.165386237" observedRunningTime="2026-04-24 22:30:19.858147197 +0000 UTC m=+42.873831596" watchObservedRunningTime="2026-04-24 22:30:19.859067892 +0000 UTC m=+42.874752287" Apr 24 22:30:21.775931 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:21.775885 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerStarted","Data":"6135f5d94d77f7eb9936c5bce6ad7c9960bbe5d76fffa0d18ad20e2293e9541c"} Apr 24 22:30:21.775931 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:21.775930 2565 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerStarted","Data":"5dd866be518dd3263e78afc8dbbe07eeebce3e3ab4c0dfb790e1361ff0433bda"} Apr 24 22:30:21.811198 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:21.811152 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" podStartSLOduration=12.90454693 podStartE2EDuration="22.811138215s" podCreationTimestamp="2026-04-24 22:29:59 +0000 UTC" firstStartedPulling="2026-04-24 22:30:11.383669531 +0000 UTC m=+34.399353917" lastFinishedPulling="2026-04-24 22:30:21.290260808 +0000 UTC m=+44.305945202" observedRunningTime="2026-04-24 22:30:21.810029055 +0000 UTC m=+44.825713453" watchObservedRunningTime="2026-04-24 22:30:21.811138215 +0000 UTC m=+44.826822613" Apr 24 22:30:22.956345 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:22.956254 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:22.959456 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:22.959435 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1034778-383a-47c7-b317-b6284cb34a98-original-pull-secret\") pod \"global-pull-secret-syncer-h2lq6\" (UID: \"f1034778-383a-47c7-b317-b6284cb34a98\") " pod="kube-system/global-pull-secret-syncer-h2lq6" Apr 24 22:30:23.192713 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:23.192675 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-h2lq6"
Apr 24 22:30:23.313978 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:23.313945 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-h2lq6"]
Apr 24 22:30:23.318060 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:30:23.318036 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1034778_383a_47c7_b317_b6284cb34a98.slice/crio-7a0cffe4773421544665d9a09243f0f69d6319735459c3012cbd6cd340799eaf WatchSource:0}: Error finding container 7a0cffe4773421544665d9a09243f0f69d6319735459c3012cbd6cd340799eaf: Status 404 returned error can't find the container with id 7a0cffe4773421544665d9a09243f0f69d6319735459c3012cbd6cd340799eaf
Apr 24 22:30:23.784061 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:23.784013 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h2lq6" event={"ID":"f1034778-383a-47c7-b317-b6284cb34a98","Type":"ContainerStarted","Data":"7a0cffe4773421544665d9a09243f0f69d6319735459c3012cbd6cd340799eaf"}
Apr 24 22:30:26.182522 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:26.182467 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:26.182553 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:26.182610 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182665 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182735 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182756 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182768 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.182744685 +0000 UTC m=+65.198429110 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182773 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182822 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.182804625 +0000 UTC m=+65.198489022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found
Apr 24 22:30:26.183047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.182841 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.182831557 +0000 UTC m=+65.198515938 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:26.283762 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:26.283712 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:30:26.283939 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.283874 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:26.283998 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:26.283944 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:30:42.283926639 +0000 UTC m=+65.299611026 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found
Apr 24 22:30:27.796255 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:27.796217 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-h2lq6" event={"ID":"f1034778-383a-47c7-b317-b6284cb34a98","Type":"ContainerStarted","Data":"bfd31d8c7e6bb03f0373a7ead8b77957159cf9d1aa57a08232b3d4b32053f43e"}
Apr 24 22:30:36.714909 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:36.714876 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sm54g"
Apr 24 22:30:36.755677 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:36.755631 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-h2lq6" podStartSLOduration=42.328316953 podStartE2EDuration="45.755617699s" podCreationTimestamp="2026-04-24 22:29:51 +0000 UTC" firstStartedPulling="2026-04-24 22:30:23.320106224 +0000 UTC m=+46.335790605" lastFinishedPulling="2026-04-24 22:30:26.747406959 +0000 UTC m=+49.763091351" observedRunningTime="2026-04-24 22:30:27.818918209 +0000 UTC m=+50.834602620" watchObservedRunningTime="2026-04-24 22:30:36.755617699 +0000 UTC m=+59.771302092"
Apr 24 22:30:42.204666 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:42.204619 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:42.204691 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:42.204724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204775 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204804 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204834 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204853 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204844 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.204825462 +0000 UTC m=+97.220509847 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204903 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.204889439 +0000 UTC m=+97.220573820 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found
Apr 24 22:30:42.205047 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.204919 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.204913461 +0000 UTC m=+97.220597838 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found
Apr 24 22:30:42.305339 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:42.305300 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:30:42.305500 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.305451 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:30:42.305539 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:42.305521 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:31:14.305500413 +0000 UTC m=+97.321184795 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found
Apr 24 22:30:43.412501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:43.412462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:30:43.412905 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:43.412572 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:30:43.412905 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:30:43.412638 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:31:47.412624568 +0000 UTC m=+130.428308945 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : secret "metrics-daemon-secret" not found
Apr 24 22:30:50.773054 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:30:50.773019 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lgf7j"
Apr 24 22:31:14.252414 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:31:14.252371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:31:14.252425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:31:14.252497 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252545 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252572 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252609 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252642 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252663 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.252642259 +0000 UTC m=+161.268326648 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252680 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.252672152 +0000 UTC m=+161.268356530 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found
Apr 24 22:31:14.252931 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.252710 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.252692679 +0000 UTC m=+161.268377069 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found
Apr 24 22:31:14.353068 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:31:14.353038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:31:14.353230 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.353188 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:31:14.353270 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:14.353246 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:32:18.353230383 +0000 UTC m=+161.368914766 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found
Apr 24 22:31:47.501179 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:31:47.501130 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:31:47.501717 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:47.501277 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 22:31:47.501717 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:31:47.501355 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs podName:ffaace54-8d28-433c-b3bf-e5664064b07e nodeName:}" failed. No retries permitted until 2026-04-24 22:33:49.501339772 +0000 UTC m=+252.517024153 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs") pod "network-metrics-daemon-9bjhd" (UID: "ffaace54-8d28-433c-b3bf-e5664064b07e") : secret "metrics-daemon-secret" not found
Apr 24 22:32:13.347665 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:13.347617 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" podUID="2f6909d3-6ce8-4b0c-986f-82f40f5d2330"
Apr 24 22:32:13.380817 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:13.380785 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t4vp4" podUID="a30d41a7-8c4f-4b0c-9cc0-a92a394596fe"
Apr 24 22:32:13.385916 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:13.385898 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-788d94b556-2c8mw" podUID="3545f18f-79bf-4b77-a115-1f3349c4650b"
Apr 24 22:32:13.459001 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:13.458967 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7pxrr" podUID="57eedb74-e256-4612-845f-7dc838139e1f"
Apr 24 22:32:13.584212 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:13.584179 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9bjhd" podUID="ffaace54-8d28-433c-b3bf-e5664064b07e"
Apr 24 22:32:14.035255 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:14.035219 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:32:14.035255 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:14.035250 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:32:14.035508 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:14.035358 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:32:14.035508 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:14.035474 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:32:18.327934 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:18.327891 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:18.327958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") pod \"image-registry-788d94b556-2c8mw\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " pod="openshift-image-registry/image-registry-788d94b556-2c8mw"
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:18.327990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4"
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328046 2565 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328081 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328119 2565 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328131 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert podName:2f6909d3-6ce8-4b0c-986f-82f40f5d2330 nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.328112447 +0000 UTC m=+283.343796831 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-txzvf" (UID: "2f6909d3-6ce8-4b0c-986f-82f40f5d2330") : secret "networking-console-plugin-cert" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328135 2565 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-788d94b556-2c8mw: secret "image-registry-tls" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328147 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls podName:a30d41a7-8c4f-4b0c-9cc0-a92a394596fe nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.328140572 +0000 UTC m=+283.343824949 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls") pod "dns-default-t4vp4" (UID: "a30d41a7-8c4f-4b0c-9cc0-a92a394596fe") : secret "dns-default-metrics-tls" not found
Apr 24 22:32:18.328329 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.328187 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls podName:3545f18f-79bf-4b77-a115-1f3349c4650b nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.328173334 +0000 UTC m=+283.343857716 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls") pod "image-registry-788d94b556-2c8mw" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b") : secret "image-registry-tls" not found
Apr 24 22:32:18.429098 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:18.429067 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr"
Apr 24 22:32:18.429225 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.429207 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 22:32:18.429276 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:32:18.429267 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert podName:57eedb74-e256-4612-845f-7dc838139e1f nodeName:}" failed. No retries permitted until 2026-04-24 22:34:20.429251573 +0000 UTC m=+283.444935955 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert") pod "ingress-canary-7pxrr" (UID: "57eedb74-e256-4612-845f-7dc838139e1f") : secret "canary-serving-cert" not found
Apr 24 22:32:19.769914 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:19.769852 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" podUID="487fa0c0-5ff1-4446-b340-3c31a158bec4" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 24 22:32:20.049322 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.049244 2565 generic.go:358] "Generic (PLEG): container finished" podID="487fa0c0-5ff1-4446-b340-3c31a158bec4" containerID="f16364010c9f75334ff206d8d9d3e4a44c9925aa908ed03873be10d972afd326" exitCode=1
Apr 24 22:32:20.049457 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.049319 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" event={"ID":"487fa0c0-5ff1-4446-b340-3c31a158bec4","Type":"ContainerDied","Data":"f16364010c9f75334ff206d8d9d3e4a44c9925aa908ed03873be10d972afd326"}
Apr 24 22:32:20.049681 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.049662 2565 scope.go:117] "RemoveContainer" containerID="f16364010c9f75334ff206d8d9d3e4a44c9925aa908ed03873be10d972afd326"
Apr 24 22:32:20.050558 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.050536 2565 generic.go:358] "Generic (PLEG): container finished" podID="900086b9-ed8f-407e-9009-80e389ecd712" containerID="b76a745825aed67ba28cd9546fd160ad94032e19098f5977de8b2669d5bb7bba" exitCode=255
Apr 24 22:32:20.050646 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.050568 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" event={"ID":"900086b9-ed8f-407e-9009-80e389ecd712","Type":"ContainerDied","Data":"b76a745825aed67ba28cd9546fd160ad94032e19098f5977de8b2669d5bb7bba"}
Apr 24 22:32:20.050855 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.050842 2565 scope.go:117] "RemoveContainer" containerID="b76a745825aed67ba28cd9546fd160ad94032e19098f5977de8b2669d5bb7bba"
Apr 24 22:32:20.640132 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.640090 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj"
Apr 24 22:32:20.656381 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:20.656350 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:32:21.054592 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:21.054546 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr" event={"ID":"487fa0c0-5ff1-4446-b340-3c31a158bec4","Type":"ContainerStarted","Data":"ba8af4489b67581d6f175fb0180e301297c29b93e6dbba21d36f6a8bde69da05"}
Apr 24 22:32:21.055021 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:21.054777 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:32:21.055742 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:21.055722 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7456857b6f-5w7wr"
Apr 24 22:32:21.056256 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:21.056224 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8494d8c4b7-dkgzj" event={"ID":"900086b9-ed8f-407e-9009-80e389ecd712","Type":"ContainerStarted","Data":"916d66d1110061e2045658a643162cff28a413cd5b402a4bae48b8784a4bb439"}
Apr 24 22:32:22.360496 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:22.360426 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztc7p_eed2cad7-a1f7-4e91-826b-b6ca9587c1c9/dns-node-resolver/0.log"
Apr 24 22:32:22.960325 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:22.960298 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c2mrs_2f1645d7-db01-4339-bbd8-b67eb1828971/node-ca/0.log"
Apr 24 22:32:25.571240 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:25.571203 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd"
Apr 24 22:32:41.805179 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.805127 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mmbbr"]
Apr 24 22:32:41.808191 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.808176 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mmbbr"
Apr 24 22:32:41.810539 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.810519 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 22:32:41.810847 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.810826 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 22:32:41.810897 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.810829 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 22:32:41.810897 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.810833 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 22:32:41.811152 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.811139 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-f6lt5\""
Apr 24 22:32:41.816358 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.816336 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mmbbr"]
Apr 24 22:32:41.904358 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.904323 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-data-volume\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr"
Apr 24 22:32:41.904501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.904370 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\"
(UniqueName: \"kubernetes.io/host-path/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-crio-socket\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:41.904501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.904388 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:41.904501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.904414 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:41.904501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:41.904456 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7dn\" (UniqueName: \"kubernetes.io/projected/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-api-access-4f7dn\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005344 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005314 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-data-volume\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " 
pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005344 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005347 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-crio-socket\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005531 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005531 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005447 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-crio-socket\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005531 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005479 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005670 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005534 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7dn\" (UniqueName: 
\"kubernetes.io/projected/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-api-access-4f7dn\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.005723 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.005706 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-data-volume\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.006051 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.006028 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.007663 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.007646 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.013677 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.013658 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7dn\" (UniqueName: \"kubernetes.io/projected/c41cb4fe-11d2-4017-a2e0-a3ba6697dc85-kube-api-access-4f7dn\") pod \"insights-runtime-extractor-mmbbr\" (UID: \"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85\") " pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.117283 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:32:42.117223 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mmbbr" Apr 24 22:32:42.234259 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:42.234232 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mmbbr"] Apr 24 22:32:42.237464 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:32:42.237437 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41cb4fe_11d2_4017_a2e0_a3ba6697dc85.slice/crio-fa3ff7dd52adeddd4a074e32e0cc4ec77210cbee1d677ce99a8ef9a16a979995 WatchSource:0}: Error finding container fa3ff7dd52adeddd4a074e32e0cc4ec77210cbee1d677ce99a8ef9a16a979995: Status 404 returned error can't find the container with id fa3ff7dd52adeddd4a074e32e0cc4ec77210cbee1d677ce99a8ef9a16a979995 Apr 24 22:32:43.107746 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:43.107661 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmbbr" event={"ID":"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85","Type":"ContainerStarted","Data":"66a9148732d3914e67e4f2f8b09c6c657a24bcf89eef894f398c9e3fd9355c25"} Apr 24 22:32:43.107746 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:43.107701 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmbbr" event={"ID":"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85","Type":"ContainerStarted","Data":"1a2c66a5112d002f81718feb9d73f8d4b4c7b7968aa24191b8fe1ecf609ced8f"} Apr 24 22:32:43.107746 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:43.107717 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmbbr" event={"ID":"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85","Type":"ContainerStarted","Data":"fa3ff7dd52adeddd4a074e32e0cc4ec77210cbee1d677ce99a8ef9a16a979995"} Apr 24 22:32:45.113856 
ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:45.113815 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mmbbr" event={"ID":"c41cb4fe-11d2-4017-a2e0-a3ba6697dc85","Type":"ContainerStarted","Data":"14213547d181f23d784fd327917d0488a9c289bb3efce3928702ccbc3e066d3d"} Apr 24 22:32:45.134311 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:45.134259 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mmbbr" podStartSLOduration=2.200973548 podStartE2EDuration="4.134243811s" podCreationTimestamp="2026-04-24 22:32:41 +0000 UTC" firstStartedPulling="2026-04-24 22:32:42.288823516 +0000 UTC m=+185.304507896" lastFinishedPulling="2026-04-24 22:32:44.222093777 +0000 UTC m=+187.237778159" observedRunningTime="2026-04-24 22:32:45.133804852 +0000 UTC m=+188.149489264" watchObservedRunningTime="2026-04-24 22:32:45.134243811 +0000 UTC m=+188.149928212" Apr 24 22:32:50.664446 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:32:50.664403 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" podUID="9d447531-8a44-4047-a2b0-0d208b808c15" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:00.665134 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:00.665092 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" podUID="9d447531-8a44-4047-a2b0-0d208b808c15" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:02.562665 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.562636 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-x7xzg"] Apr 24 22:33:02.565919 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.565903 2565 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.567859 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.567834 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 22:33:02.568013 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.567840 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6nxsq\"" Apr 24 22:33:02.568013 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.567977 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 22:33:02.568337 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.568322 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 22:33:02.568681 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.568663 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 22:33:02.568775 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.568717 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 22:33:02.568843 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.568802 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 22:33:02.666895 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.666871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-sys\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " 
pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667020 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.666916 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-root\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667020 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.666936 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-tls\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667020 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-metrics-client-ca\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667030 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987f5\" (UniqueName: \"kubernetes.io/projected/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-kube-api-access-987f5\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667053 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-textfile\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667074 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667139 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-wtmp\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.667176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.667165 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-accelerators-collector-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768272 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-metrics-client-ca\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 
22:33:02.768412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-987f5\" (UniqueName: \"kubernetes.io/projected/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-kube-api-access-987f5\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768295 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-textfile\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768356 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-wtmp\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768382 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-accelerators-collector-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-sys\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768480 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-root\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768501 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-tls\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768519 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-wtmp\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768602 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-root\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768610 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-sys\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.768691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.768691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-textfile\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.769115 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.769094 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-accelerators-collector-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.769503 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.769486 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-metrics-client-ca\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.770796 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.770781 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-tls\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.770929 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.770915 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.786354 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.786333 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-987f5\" (UniqueName: \"kubernetes.io/projected/f843b3fb-0ad4-46bb-b8ee-e06629cf2430-kube-api-access-987f5\") pod \"node-exporter-x7xzg\" (UID: \"f843b3fb-0ad4-46bb-b8ee-e06629cf2430\") " pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.875491 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:02.875431 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-x7xzg" Apr 24 22:33:02.883436 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:33:02.883392 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf843b3fb_0ad4_46bb_b8ee_e06629cf2430.slice/crio-61cc6d8459eb430704e232bc15d1f752ffa30c2b4c9987dc5e99ae8f53970abe WatchSource:0}: Error finding container 61cc6d8459eb430704e232bc15d1f752ffa30c2b4c9987dc5e99ae8f53970abe: Status 404 returned error can't find the container with id 61cc6d8459eb430704e232bc15d1f752ffa30c2b4c9987dc5e99ae8f53970abe Apr 24 22:33:03.158797 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:03.158714 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x7xzg" event={"ID":"f843b3fb-0ad4-46bb-b8ee-e06629cf2430","Type":"ContainerStarted","Data":"61cc6d8459eb430704e232bc15d1f752ffa30c2b4c9987dc5e99ae8f53970abe"} Apr 24 22:33:04.117487 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.117456 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-788d94b556-2c8mw"] Apr 24 22:33:04.117847 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:33:04.117656 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-788d94b556-2c8mw" podUID="3545f18f-79bf-4b77-a115-1f3349c4650b" Apr 24 22:33:04.166027 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.165995 2565 generic.go:358] "Generic (PLEG): container finished" podID="f843b3fb-0ad4-46bb-b8ee-e06629cf2430" containerID="7faa21e3d77df0d1b77cd38181f9d46b50d4b3a092ffe73a33578f825b9035fe" exitCode=0 Apr 24 22:33:04.166193 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.166087 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x7xzg" 
event={"ID":"f843b3fb-0ad4-46bb-b8ee-e06629cf2430","Type":"ContainerDied","Data":"7faa21e3d77df0d1b77cd38181f9d46b50d4b3a092ffe73a33578f825b9035fe"} Apr 24 22:33:04.166266 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.166203 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:33:04.170412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.170398 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:33:04.279889 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.279870 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.279989 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.279903 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.279989 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.279932 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.279989 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.279958 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.280197 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280018 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rgd9\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.280197 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280070 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.280197 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280103 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token\") pod \"3545f18f-79bf-4b77-a115-1f3349c4650b\" (UID: \"3545f18f-79bf-4b77-a115-1f3349c4650b\") " Apr 24 22:33:04.280356 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280329 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:04.280410 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280373 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:33:04.280476 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280455 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:33:04.280525 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280506 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-trusted-ca\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.280565 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.280528 2565 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3545f18f-79bf-4b77-a115-1f3349c4650b-ca-trust-extracted\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.282438 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.282401 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: 
"3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:04.282713 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.282688 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9" (OuterVolumeSpecName: "kube-api-access-5rgd9") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "kube-api-access-5rgd9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:33:04.282874 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.282854 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:04.283071 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.283043 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3545f18f-79bf-4b77-a115-1f3349c4650b" (UID: "3545f18f-79bf-4b77-a115-1f3349c4650b"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:33:04.381698 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.381627 2565 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-bound-sa-token\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.381698 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.381651 2565 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-certificates\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.381698 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.381662 2565 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-installation-pull-secrets\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.381698 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.381672 2565 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3545f18f-79bf-4b77-a115-1f3349c4650b-image-registry-private-configuration\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:04.381698 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:04.381682 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rgd9\" (UniqueName: \"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-kube-api-access-5rgd9\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:05.170219 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.170191 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-788d94b556-2c8mw" Apr 24 22:33:05.170636 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.170220 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x7xzg" event={"ID":"f843b3fb-0ad4-46bb-b8ee-e06629cf2430","Type":"ContainerStarted","Data":"3ee9d761d0338718d0aeb99df597e5e5d52a3476f195168747cf20d857c12de5"} Apr 24 22:33:05.170636 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.170255 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x7xzg" event={"ID":"f843b3fb-0ad4-46bb-b8ee-e06629cf2430","Type":"ContainerStarted","Data":"936e66c2fb8331671d2032baf7406e6b6ef4ad099e23ff4a1b41c44a2ee3e4f3"} Apr 24 22:33:05.196893 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.196853 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-x7xzg" podStartSLOduration=2.469630282 podStartE2EDuration="3.196839749s" podCreationTimestamp="2026-04-24 22:33:02 +0000 UTC" firstStartedPulling="2026-04-24 22:33:02.885276443 +0000 UTC m=+205.900960826" lastFinishedPulling="2026-04-24 22:33:03.612485901 +0000 UTC m=+206.628170293" observedRunningTime="2026-04-24 22:33:05.196228416 +0000 UTC m=+208.211912828" watchObservedRunningTime="2026-04-24 22:33:05.196839749 +0000 UTC m=+208.212524147" Apr 24 22:33:05.219779 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.219757 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-788d94b556-2c8mw"] Apr 24 22:33:05.223588 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.223556 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-788d94b556-2c8mw"] Apr 24 22:33:05.288806 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.288780 2565 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/3545f18f-79bf-4b77-a115-1f3349c4650b-registry-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:33:05.575048 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:05.575009 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3545f18f-79bf-4b77-a115-1f3349c4650b" path="/var/lib/kubelet/pods/3545f18f-79bf-4b77-a115-1f3349c4650b/volumes" Apr 24 22:33:10.664886 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:10.664848 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" podUID="9d447531-8a44-4047-a2b0-0d208b808c15" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 22:33:10.665231 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:10.664912 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" Apr 24 22:33:10.665344 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:10.665315 2565 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"6135f5d94d77f7eb9936c5bce6ad7c9960bbe5d76fffa0d18ad20e2293e9541c"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 22:33:10.665380 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:10.665364 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" podUID="9d447531-8a44-4047-a2b0-0d208b808c15" containerName="service-proxy" containerID="cri-o://6135f5d94d77f7eb9936c5bce6ad7c9960bbe5d76fffa0d18ad20e2293e9541c" gracePeriod=30 Apr 24 22:33:11.186309 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:11.186279 2565 generic.go:358] 
"Generic (PLEG): container finished" podID="9d447531-8a44-4047-a2b0-0d208b808c15" containerID="6135f5d94d77f7eb9936c5bce6ad7c9960bbe5d76fffa0d18ad20e2293e9541c" exitCode=2 Apr 24 22:33:11.186463 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:11.186318 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerDied","Data":"6135f5d94d77f7eb9936c5bce6ad7c9960bbe5d76fffa0d18ad20e2293e9541c"} Apr 24 22:33:11.186463 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:11.186339 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8689464f96-zdzfz" event={"ID":"9d447531-8a44-4047-a2b0-0d208b808c15","Type":"ContainerStarted","Data":"c75cf608ab7cc50632b834fad640d24c4d80a853976d92133ae3e824db6210c9"} Apr 24 22:33:49.508795 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:49.508746 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:33:49.511049 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:49.511021 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ffaace54-8d28-433c-b3bf-e5664064b07e-metrics-certs\") pod \"network-metrics-daemon-9bjhd\" (UID: \"ffaace54-8d28-433c-b3bf-e5664064b07e\") " pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:33:49.573602 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:49.573564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h92dk\"" Apr 24 22:33:49.582163 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:33:49.582143 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9bjhd" Apr 24 22:33:49.691473 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:49.691443 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9bjhd"] Apr 24 22:33:49.694496 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:33:49.694471 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffaace54_8d28_433c_b3bf_e5664064b07e.slice/crio-d2339c444a5cd414eff10d44e97d7bdf606e667397bdb4e2238b299fe6b1f98a WatchSource:0}: Error finding container d2339c444a5cd414eff10d44e97d7bdf606e667397bdb4e2238b299fe6b1f98a: Status 404 returned error can't find the container with id d2339c444a5cd414eff10d44e97d7bdf606e667397bdb4e2238b299fe6b1f98a Apr 24 22:33:50.276061 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:50.276029 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bjhd" event={"ID":"ffaace54-8d28-433c-b3bf-e5664064b07e","Type":"ContainerStarted","Data":"d2339c444a5cd414eff10d44e97d7bdf606e667397bdb4e2238b299fe6b1f98a"} Apr 24 22:33:51.280193 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:51.280165 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bjhd" event={"ID":"ffaace54-8d28-433c-b3bf-e5664064b07e","Type":"ContainerStarted","Data":"27e973f62e1e23cf141f9b14d800307f0c6216a78fba55d5a4f75c51c4693c76"} Apr 24 22:33:51.280193 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:33:51.280196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9bjhd" event={"ID":"ffaace54-8d28-433c-b3bf-e5664064b07e","Type":"ContainerStarted","Data":"01788ee5d4c703c8dba39554da9c2e946586582ecfaf749cc3df5ec28cf48aab"} Apr 24 22:33:51.295513 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:33:51.295460 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9bjhd" podStartSLOduration=253.186411496 podStartE2EDuration="4m14.295444317s" podCreationTimestamp="2026-04-24 22:29:37 +0000 UTC" firstStartedPulling="2026-04-24 22:33:49.696687818 +0000 UTC m=+252.712372195" lastFinishedPulling="2026-04-24 22:33:50.805720639 +0000 UTC m=+253.821405016" observedRunningTime="2026-04-24 22:33:51.294772887 +0000 UTC m=+254.310457316" watchObservedRunningTime="2026-04-24 22:33:51.295444317 +0000 UTC m=+254.311128719" Apr 24 22:34:17.036397 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:34:17.036355 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7pxrr" podUID="57eedb74-e256-4612-845f-7dc838139e1f" Apr 24 22:34:17.036397 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:34:17.036355 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-t4vp4" podUID="a30d41a7-8c4f-4b0c-9cc0-a92a394596fe" Apr 24 22:34:17.036397 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:34:17.036355 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" podUID="2f6909d3-6ce8-4b0c-986f-82f40f5d2330" Apr 24 22:34:17.343222 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:17.343150 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:34:17.343354 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:17.343264 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:34:17.343354 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:17.343286 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:20.418058 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.418026 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:34:20.418524 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.418079 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:20.420355 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.420335 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a30d41a7-8c4f-4b0c-9cc0-a92a394596fe-metrics-tls\") pod \"dns-default-t4vp4\" (UID: \"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe\") " pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:20.420561 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.420541 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/2f6909d3-6ce8-4b0c-986f-82f40f5d2330-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-txzvf\" (UID: \"2f6909d3-6ce8-4b0c-986f-82f40f5d2330\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:34:20.519191 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.519154 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:34:20.521389 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.521359 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57eedb74-e256-4612-845f-7dc838139e1f-cert\") pod \"ingress-canary-7pxrr\" (UID: \"57eedb74-e256-4612-845f-7dc838139e1f\") " pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:34:20.646509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.646478 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z8wpg\"" Apr 24 22:34:20.646509 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.646478 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jqx86\"" Apr 24 22:34:20.646760 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.646478 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zzr8g\"" Apr 24 22:34:20.654501 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.654479 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" Apr 24 22:34:20.654620 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.654507 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:20.654620 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.654611 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7pxrr" Apr 24 22:34:20.787818 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.787773 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7pxrr"] Apr 24 22:34:20.792082 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:34:20.792054 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57eedb74_e256_4612_845f_7dc838139e1f.slice/crio-a979296a121d5a8a859d8a97c4adef70ec5cfd0b32523ec20be041bf8081c3cd WatchSource:0}: Error finding container a979296a121d5a8a859d8a97c4adef70ec5cfd0b32523ec20be041bf8081c3cd: Status 404 returned error can't find the container with id a979296a121d5a8a859d8a97c4adef70ec5cfd0b32523ec20be041bf8081c3cd Apr 24 22:34:20.807658 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.807170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-txzvf"] Apr 24 22:34:20.810680 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:34:20.810596 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6909d3_6ce8_4b0c_986f_82f40f5d2330.slice/crio-db68e01535dd4bd7fe690847d2e6f3d1506bce297f52f64b31f3f085caa91265 WatchSource:0}: Error finding container db68e01535dd4bd7fe690847d2e6f3d1506bce297f52f64b31f3f085caa91265: Status 404 returned error can't find the container with id 
db68e01535dd4bd7fe690847d2e6f3d1506bce297f52f64b31f3f085caa91265 Apr 24 22:34:20.831317 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:20.831293 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t4vp4"] Apr 24 22:34:20.833982 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:34:20.833959 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda30d41a7_8c4f_4b0c_9cc0_a92a394596fe.slice/crio-7a02f514c2b69ae3f27506a7714e54cf948541ac4f0f4bb6684a77ac82f1a1f7 WatchSource:0}: Error finding container 7a02f514c2b69ae3f27506a7714e54cf948541ac4f0f4bb6684a77ac82f1a1f7: Status 404 returned error can't find the container with id 7a02f514c2b69ae3f27506a7714e54cf948541ac4f0f4bb6684a77ac82f1a1f7 Apr 24 22:34:21.354547 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:21.354501 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7pxrr" event={"ID":"57eedb74-e256-4612-845f-7dc838139e1f","Type":"ContainerStarted","Data":"a979296a121d5a8a859d8a97c4adef70ec5cfd0b32523ec20be041bf8081c3cd"} Apr 24 22:34:21.355599 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:21.355555 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" event={"ID":"2f6909d3-6ce8-4b0c-986f-82f40f5d2330","Type":"ContainerStarted","Data":"db68e01535dd4bd7fe690847d2e6f3d1506bce297f52f64b31f3f085caa91265"} Apr 24 22:34:21.356604 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:21.356561 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4vp4" event={"ID":"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe","Type":"ContainerStarted","Data":"7a02f514c2b69ae3f27506a7714e54cf948541ac4f0f4bb6684a77ac82f1a1f7"} Apr 24 22:34:23.364937 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.364902 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-t4vp4" event={"ID":"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe","Type":"ContainerStarted","Data":"214aeaed6bbeb2c4282feecb11ae13a612965294a2a33983cc93a9f123c6c2b6"} Apr 24 22:34:23.365396 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.364945 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t4vp4" event={"ID":"a30d41a7-8c4f-4b0c-9cc0-a92a394596fe","Type":"ContainerStarted","Data":"ffd8cce482330f793b8e53bc48a42b839cf09201f9412d4fda88447c79571785"} Apr 24 22:34:23.365396 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.365050 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:23.366316 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.366295 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7pxrr" event={"ID":"57eedb74-e256-4612-845f-7dc838139e1f","Type":"ContainerStarted","Data":"c0eb3eba4556d7f899efd8fb388cc39d15fb1f494a08c554f95f9a7e5e765805"} Apr 24 22:34:23.367481 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.367459 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" event={"ID":"2f6909d3-6ce8-4b0c-986f-82f40f5d2330","Type":"ContainerStarted","Data":"f146e3bdb5284c9ac1bf39e3ceb995dec37460e2f1011f5db1867475e1a97d4b"} Apr 24 22:34:23.381253 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.381180 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t4vp4" podStartSLOduration=251.362982133 podStartE2EDuration="4m13.381164369s" podCreationTimestamp="2026-04-24 22:30:10 +0000 UTC" firstStartedPulling="2026-04-24 22:34:20.835544333 +0000 UTC m=+283.851228713" lastFinishedPulling="2026-04-24 22:34:22.853726558 +0000 UTC m=+285.869410949" observedRunningTime="2026-04-24 22:34:23.380473484 +0000 UTC m=+286.396157882" 
watchObservedRunningTime="2026-04-24 22:34:23.381164369 +0000 UTC m=+286.396848749" Apr 24 22:34:23.395618 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.395558 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-txzvf" podStartSLOduration=267.360762315 podStartE2EDuration="4m29.395549289s" podCreationTimestamp="2026-04-24 22:29:54 +0000 UTC" firstStartedPulling="2026-04-24 22:34:20.814062539 +0000 UTC m=+283.829746926" lastFinishedPulling="2026-04-24 22:34:22.84884952 +0000 UTC m=+285.864533900" observedRunningTime="2026-04-24 22:34:23.394638563 +0000 UTC m=+286.410322961" watchObservedRunningTime="2026-04-24 22:34:23.395549289 +0000 UTC m=+286.411233688" Apr 24 22:34:23.412797 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:23.412756 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7pxrr" podStartSLOduration=251.348889823 podStartE2EDuration="4m13.412742571s" podCreationTimestamp="2026-04-24 22:30:10 +0000 UTC" firstStartedPulling="2026-04-24 22:34:20.794376874 +0000 UTC m=+283.810061251" lastFinishedPulling="2026-04-24 22:34:22.858229607 +0000 UTC m=+285.873913999" observedRunningTime="2026-04-24 22:34:23.411256902 +0000 UTC m=+286.426941300" watchObservedRunningTime="2026-04-24 22:34:23.412742571 +0000 UTC m=+286.428426971" Apr 24 22:34:33.373361 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:33.373333 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t4vp4" Apr 24 22:34:37.504180 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:37.504153 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:34:37.505308 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:37.505282 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:34:37.509386 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:34:37.509366 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 22:35:52.432508 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.432422 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"] Apr 24 22:35:52.435456 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.435435 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" Apr 24 22:35:52.437867 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.437845 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 22:35:52.442622 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.442605 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 22:35:52.460010 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.459992 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 22:35:52.460231 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.460216 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 22:35:52.460629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.460602 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 22:35:52.460735 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.460663 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-8knqq\"" Apr 24 22:35:52.461587 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:35:52.461554 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"] Apr 24 22:35:52.511300 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.511270 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/374718eb-622e-492f-8590-defb8165eacf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" Apr 24 22:35:52.511427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.511327 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" Apr 24 22:35:52.511427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.511401 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbcm8\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-kube-api-access-wbcm8\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" Apr 24 22:35:52.612173 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.612147 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbcm8\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-kube-api-access-wbcm8\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" Apr 24 22:35:52.612320 ip-10-0-142-202 
kubenswrapper[2565]: I0424 22:35:52.612192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/374718eb-622e-492f-8590-defb8165eacf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:52.612320 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.612241 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:52.612393 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:52.612324 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:35:52.612393 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:52.612338 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:35:52.612393 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:52.612356 2565 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 22:35:52.612393 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:52.612375 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 22:35:52.612545 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:52.612440 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates podName:374718eb-622e-492f-8590-defb8165eacf nodeName:}" failed. No retries permitted until 2026-04-24 22:35:53.112422921 +0000 UTC m=+376.128107301 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates") pod "keda-metrics-apiserver-7c9f485588-94bvp" (UID: "374718eb-622e-492f-8590-defb8165eacf") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 22:35:52.612630 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.612612 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/374718eb-622e-492f-8590-defb8165eacf-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:52.621322 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:52.621301 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbcm8\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-kube-api-access-wbcm8\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:53.116979 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:53.116934 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:53.117193 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:53.117096 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:35:53.117193 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:53.117118 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:35:53.117193 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:53.117137 2565 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 22:35:53.117193 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:53.117155 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 22:35:53.117389 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:53.117230 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates podName:374718eb-622e-492f-8590-defb8165eacf nodeName:}" failed. No retries permitted until 2026-04-24 22:35:54.117211564 +0000 UTC m=+377.132895942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates") pod "keda-metrics-apiserver-7c9f485588-94bvp" (UID: "374718eb-622e-492f-8590-defb8165eacf") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 22:35:54.125661 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:54.125630 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:54.126042 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:54.125736 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:35:54.126042 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:54.125747 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:35:54.126042 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:54.125763 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp: references non-existent secret key: tls.crt
Apr 24 22:35:54.126042 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:54.125808 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates podName:374718eb-622e-492f-8590-defb8165eacf nodeName:}" failed. No retries permitted until 2026-04-24 22:35:56.125796157 +0000 UTC m=+379.141480534 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates") pod "keda-metrics-apiserver-7c9f485588-94bvp" (UID: "374718eb-622e-492f-8590-defb8165eacf") : references non-existent secret key: tls.crt
Apr 24 22:35:56.142192 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:35:56.142146 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:35:56.142623 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:56.142257 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 22:35:56.142623 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:56.142270 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 22:35:56.142623 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:56.142286 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp: references non-existent secret key: tls.crt
Apr 24 22:35:56.142623 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:35:56.142341 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates podName:374718eb-622e-492f-8590-defb8165eacf nodeName:}" failed. No retries permitted until 2026-04-24 22:36:00.14232899 +0000 UTC m=+383.158013367 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates") pod "keda-metrics-apiserver-7c9f485588-94bvp" (UID: "374718eb-622e-492f-8590-defb8165eacf") : references non-existent secret key: tls.crt
Apr 24 22:36:00.169353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.169298 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:36:00.171882 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.171859 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/374718eb-622e-492f-8590-defb8165eacf-certificates\") pod \"keda-metrics-apiserver-7c9f485588-94bvp\" (UID: \"374718eb-622e-492f-8590-defb8165eacf\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:36:00.244790 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.244757 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:36:00.356503 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.356469 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"]
Apr 24 22:36:00.359492 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:36:00.359460 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod374718eb_622e_492f_8590_defb8165eacf.slice/crio-f1719bf9bb4def1efa4c0bfa949fe4c002d39c4d36fd44904e1664d1a1590102 WatchSource:0}: Error finding container f1719bf9bb4def1efa4c0bfa949fe4c002d39c4d36fd44904e1664d1a1590102: Status 404 returned error can't find the container with id f1719bf9bb4def1efa4c0bfa949fe4c002d39c4d36fd44904e1664d1a1590102
Apr 24 22:36:00.360622 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.360603 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:36:00.606153 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:00.606121 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" event={"ID":"374718eb-622e-492f-8590-defb8165eacf","Type":"ContainerStarted","Data":"f1719bf9bb4def1efa4c0bfa949fe4c002d39c4d36fd44904e1664d1a1590102"}
Apr 24 22:36:03.615922 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:03.615884 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" event={"ID":"374718eb-622e-492f-8590-defb8165eacf","Type":"ContainerStarted","Data":"98cf48f5b33c07296793ef67541017057244495bc14c6e1188831ab1f3673870"}
Apr 24 22:36:03.616295 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:03.616117 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:36:03.632931 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:03.632882 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp" podStartSLOduration=8.779700796 podStartE2EDuration="11.632866225s" podCreationTimestamp="2026-04-24 22:35:52 +0000 UTC" firstStartedPulling="2026-04-24 22:36:00.360734083 +0000 UTC m=+383.376418460" lastFinishedPulling="2026-04-24 22:36:03.213899499 +0000 UTC m=+386.229583889" observedRunningTime="2026-04-24 22:36:03.63122438 +0000 UTC m=+386.646908778" watchObservedRunningTime="2026-04-24 22:36:03.632866225 +0000 UTC m=+386.648550630"
Apr 24 22:36:14.622867 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:36:14.622842 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-94bvp"
Apr 24 22:37:02.485642 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.485551 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"]
Apr 24 22:37:02.488786 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.488766 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.490604 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.490564 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 22:37:02.490743 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.490724 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 22:37:02.490818 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.490742 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 22:37:02.491051 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.491035 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nl89p\""
Apr 24 22:37:02.497058 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.497036 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"]
Apr 24 22:37:02.603514 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.603471 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjdn\" (UniqueName: \"kubernetes.io/projected/a096ba47-2ec2-494d-830b-c280b7ccced6-kube-api-access-8kjdn\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.603689 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.603556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a096ba47-2ec2-494d-830b-c280b7ccced6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
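The keda-metrics-apiserver entries above show the kubelet's per-volume retry backoff: each failed MountVolume.SetUp is re-queued with a doubled "durationBeforeRetry" (500ms, then 1s, 2s, 4s) until the missing secret finally appears and the mount succeeds. As a minimal sketch of that doubling schedule (illustrative only, not kubelet source; the function name and the cap value here are assumptions):

```python
# Illustrative sketch of the doubling retry delay visible in the log's
# "durationBeforeRetry" values (500ms -> 1s -> 2s -> 4s ...). The cap of
# 16s and the function name are made up for this example.
def backoff_delays(initial_ms=500, cap_ms=16000, attempts=6):
    """Yield successive retry delays in milliseconds, doubling up to a cap."""
    delay = initial_ms
    for _ in range(attempts):
        yield delay
        delay = min(delay * 2, cap_ms)

print(list(backoff_delays()))  # [500, 1000, 2000, 4000, 8000, 16000]
```

The effect, as in the log, is that transient failures (a cert not yet issued by the operator) resolve themselves within a few retries without hammering the API server.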
Apr 24 22:37:02.704897 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.704862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a096ba47-2ec2-494d-830b-c280b7ccced6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.705082 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.704929 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjdn\" (UniqueName: \"kubernetes.io/projected/a096ba47-2ec2-494d-830b-c280b7ccced6-kube-api-access-8kjdn\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.707302 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.707277 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a096ba47-2ec2-494d-830b-c280b7ccced6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.713389 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.713369 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjdn\" (UniqueName: \"kubernetes.io/projected/a096ba47-2ec2-494d-830b-c280b7ccced6-kube-api-access-8kjdn\") pod \"llmisvc-controller-manager-68cc5db7c4-6fsph\" (UID: \"a096ba47-2ec2-494d-830b-c280b7ccced6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.798993 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.798925 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:02.908090 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:02.908061 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"]
Apr 24 22:37:02.911086 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:37:02.911058 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda096ba47_2ec2_494d_830b_c280b7ccced6.slice/crio-d9841fa5a15111033953c88c5dbfeb3e5576d80758b40245b7b2c210078d3c9f WatchSource:0}: Error finding container d9841fa5a15111033953c88c5dbfeb3e5576d80758b40245b7b2c210078d3c9f: Status 404 returned error can't find the container with id d9841fa5a15111033953c88c5dbfeb3e5576d80758b40245b7b2c210078d3c9f
Apr 24 22:37:03.769763 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:03.769726 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph" event={"ID":"a096ba47-2ec2-494d-830b-c280b7ccced6","Type":"ContainerStarted","Data":"d9841fa5a15111033953c88c5dbfeb3e5576d80758b40245b7b2c210078d3c9f"}
Apr 24 22:37:05.775328 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:05.775289 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph" event={"ID":"a096ba47-2ec2-494d-830b-c280b7ccced6","Type":"ContainerStarted","Data":"52e841256ee61521eb0a363c0a8c45809e862a149f44505227a93508ed440509"}
Apr 24 22:37:05.775769 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:05.775499 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:37:05.792493 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:05.792443 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph" podStartSLOduration=1.994920199 podStartE2EDuration="3.792431458s" podCreationTimestamp="2026-04-24 22:37:02 +0000 UTC" firstStartedPulling="2026-04-24 22:37:02.912308584 +0000 UTC m=+445.927992960" lastFinishedPulling="2026-04-24 22:37:04.709819842 +0000 UTC m=+447.725504219" observedRunningTime="2026-04-24 22:37:05.791240173 +0000 UTC m=+448.806924582" watchObservedRunningTime="2026-04-24 22:37:05.792431458 +0000 UTC m=+448.808115856"
Apr 24 22:37:36.779927 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:37:36.779900 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-6fsph"
Apr 24 22:38:11.917697 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.917643 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-82wxw"]
Apr 24 22:38:11.921051 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.921032 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:11.924728 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.924707 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rfm9x\""
Apr 24 22:38:11.924880 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.924862 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 22:38:11.939474 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.939453 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-82wxw"]
Apr 24 22:38:11.992142 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.992108 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6cd\" (UniqueName: \"kubernetes.io/projected/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-kube-api-access-rg6cd\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:11.992317 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:11.992180 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-cert\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.092929 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.092902 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg6cd\" (UniqueName: \"kubernetes.io/projected/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-kube-api-access-rg6cd\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.093080 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.092949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-cert\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.095282 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.095252 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-cert\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.102266 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.102234 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg6cd\" (UniqueName: \"kubernetes.io/projected/ba03a81b-c63a-4992-80ed-f1a2fcadc8f6-kube-api-access-rg6cd\") pod \"odh-model-controller-696fc77849-82wxw\" (UID: \"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6\") " pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.230589 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.230547 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:12.341075 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.341043 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-82wxw"]
Apr 24 22:38:12.344161 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:38:12.344122 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba03a81b_c63a_4992_80ed_f1a2fcadc8f6.slice/crio-757ebe69ddd24bdcd8d6813b4b0f0bb85bcfe230e07d798c2b0da759fa06104b WatchSource:0}: Error finding container 757ebe69ddd24bdcd8d6813b4b0f0bb85bcfe230e07d798c2b0da759fa06104b: Status 404 returned error can't find the container with id 757ebe69ddd24bdcd8d6813b4b0f0bb85bcfe230e07d798c2b0da759fa06104b
Apr 24 22:38:12.951908 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:12.951867 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-82wxw" event={"ID":"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6","Type":"ContainerStarted","Data":"757ebe69ddd24bdcd8d6813b4b0f0bb85bcfe230e07d798c2b0da759fa06104b"}
Apr 24 22:38:14.957361 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:14.957322 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-82wxw" event={"ID":"ba03a81b-c63a-4992-80ed-f1a2fcadc8f6","Type":"ContainerStarted","Data":"590821dc2e76c5af9d996851acecc44260e6dfbeb2eb0db637c35433a87ba80c"}
Apr 24 22:38:14.957843 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:14.957422 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:14.972355 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:14.972302 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-82wxw" podStartSLOduration=1.488483396 podStartE2EDuration="3.972289693s" podCreationTimestamp="2026-04-24 22:38:11 +0000 UTC" firstStartedPulling="2026-04-24 22:38:12.345338189 +0000 UTC m=+515.361022567" lastFinishedPulling="2026-04-24 22:38:14.829144484 +0000 UTC m=+517.844828864" observedRunningTime="2026-04-24 22:38:14.971798149 +0000 UTC m=+517.987482549" watchObservedRunningTime="2026-04-24 22:38:14.972289693 +0000 UTC m=+517.987974091"
Apr 24 22:38:25.963429 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:25.963351 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-82wxw"
Apr 24 22:38:46.901181 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.901146 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"]
Apr 24 22:38:46.904487 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.904468 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:46.907031 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.907005 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1b9dc-predictor-serving-cert\""
Apr 24 22:38:46.907397 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.907378 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 22:38:46.907397 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.907385 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-58j9x\""
Apr 24 22:38:46.907562 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.907425 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 22:38:46.907562 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.907425 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\""
Apr 24 22:38:46.912751 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.912728 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"]
Apr 24 22:38:46.935758 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.935730 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:46.935903 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.935761 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fkw\" (UniqueName: \"kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:46.935903 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:46.935800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.036904 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.036861 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.036904 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.036894 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fkw\" (UniqueName: \"kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.037138 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.036938 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.037138 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:38:47.037003 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-serving-cert: secret "success-200-isvc-1b9dc-predictor-serving-cert" not found
Apr 24 22:38:47.037138 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:38:47.037073 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls podName:1239da72-897f-4238-97c9-d1096f1fea83 nodeName:}" failed. No retries permitted until 2026-04-24 22:38:47.537054146 +0000 UTC m=+550.552738534 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls") pod "success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" (UID: "1239da72-897f-4238-97c9-d1096f1fea83") : secret "success-200-isvc-1b9dc-predictor-serving-cert" not found
Apr 24 22:38:47.037642 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.037618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.049197 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.049152 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fkw\" (UniqueName: \"kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:38:47.151238 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.151154 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"]
Apr 24 22:38:47.154831 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.154808 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.156829 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.156808 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 24 22:38:47.156930 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.156840 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 24 22:38:47.163675 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.163654 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"]
Apr 24 22:38:47.238854 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.238817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.239025 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.238863 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.239025 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.238892 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bndxf\" (UniqueName: \"kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.239025 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.238951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340134 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bndxf\" (UniqueName: \"kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340140 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340197 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340220 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340441 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:38:47.340317 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-predictor-serving-cert: secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 24 22:38:47.340441 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:38:47.340384 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls podName:79c9f152-cc03-467c-b33e-c1c8523e218d nodeName:}" failed. No retries permitted until 2026-04-24 22:38:47.840366229 +0000 UTC m=+550.856050605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls") pod "isvc-xgboost-graph-predictor-669d8d6456-p5mg2" (UID: "79c9f152-cc03-467c-b33e-c1c8523e218d") : secret "isvc-xgboost-graph-predictor-serving-cert" not found
Apr 24 22:38:47.340681 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340659 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.340892 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.340874 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.359026 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.358996 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bndxf\" (UniqueName: \"kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:38:47.542156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.542116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") pod
\"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:38:47.544748 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.544722 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") pod \"success-200-isvc-1b9dc-predictor-68868569ff-tdd6b\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") " pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:38:47.815268 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.815181 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:38:47.844468 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.844430 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:38:47.846877 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.846847 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-p5mg2\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:38:47.936934 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:47.936904 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"] Apr 24 22:38:47.939941 ip-10-0-142-202 kubenswrapper[2565]: W0424 
22:38:47.939912 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1239da72_897f_4238_97c9_d1096f1fea83.slice/crio-0dff1ff766d6d44683ed9f64ab0e05f6c85fd4ca8061c676a7c9c735a2bbfc73 WatchSource:0}: Error finding container 0dff1ff766d6d44683ed9f64ab0e05f6c85fd4ca8061c676a7c9c735a2bbfc73: Status 404 returned error can't find the container with id 0dff1ff766d6d44683ed9f64ab0e05f6c85fd4ca8061c676a7c9c735a2bbfc73 Apr 24 22:38:48.042937 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:48.042900 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerStarted","Data":"0dff1ff766d6d44683ed9f64ab0e05f6c85fd4ca8061c676a7c9c735a2bbfc73"} Apr 24 22:38:48.065911 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:48.065824 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:38:48.182409 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:48.182373 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"] Apr 24 22:38:48.185124 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:38:48.185090 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c9f152_cc03_467c_b33e_c1c8523e218d.slice/crio-3fab459caa182fd5e6ed8b00d08b2e1be218e060b71a6872bc5d2102fc6cc446 WatchSource:0}: Error finding container 3fab459caa182fd5e6ed8b00d08b2e1be218e060b71a6872bc5d2102fc6cc446: Status 404 returned error can't find the container with id 3fab459caa182fd5e6ed8b00d08b2e1be218e060b71a6872bc5d2102fc6cc446 Apr 24 22:38:49.051451 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:38:49.051414 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerStarted","Data":"3fab459caa182fd5e6ed8b00d08b2e1be218e060b71a6872bc5d2102fc6cc446"} Apr 24 22:39:02.094317 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:02.094278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerStarted","Data":"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"} Apr 24 22:39:02.095630 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:02.095603 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerStarted","Data":"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"} Apr 24 22:39:05.106954 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:05.106916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerStarted","Data":"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"} Apr 24 22:39:05.107344 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:05.107107 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:39:05.124739 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:05.124692 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podStartSLOduration=3.0117128 podStartE2EDuration="19.124676622s" podCreationTimestamp="2026-04-24 22:38:46 +0000 UTC" firstStartedPulling="2026-04-24 22:38:47.941648518 +0000 UTC m=+550.957332894" 
lastFinishedPulling="2026-04-24 22:39:04.054612335 +0000 UTC m=+567.070296716" observedRunningTime="2026-04-24 22:39:05.123461502 +0000 UTC m=+568.139145901" watchObservedRunningTime="2026-04-24 22:39:05.124676622 +0000 UTC m=+568.140361020" Apr 24 22:39:06.110609 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:06.110558 2565 generic.go:358] "Generic (PLEG): container finished" podID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerID="cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c" exitCode=0 Apr 24 22:39:06.111057 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:06.110630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerDied","Data":"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"} Apr 24 22:39:06.111057 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:06.110903 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:39:06.112350 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:06.112325 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:07.113829 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:07.113785 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:12.119033 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:12.118983 2565 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:39:12.119589 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:12.119534 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:22.119619 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:22.119550 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:24.165287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.165251 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerStarted","Data":"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"} Apr 24 22:39:24.165287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.165292 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerStarted","Data":"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"} Apr 24 22:39:24.165733 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.165623 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:39:24.165777 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.165738 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:39:24.166896 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.166870 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:39:24.182934 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:24.182895 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podStartSLOduration=1.7662137 podStartE2EDuration="37.182883072s" podCreationTimestamp="2026-04-24 22:38:47 +0000 UTC" firstStartedPulling="2026-04-24 22:38:48.187061288 +0000 UTC m=+551.202745670" lastFinishedPulling="2026-04-24 22:39:23.603730661 +0000 UTC m=+586.619415042" observedRunningTime="2026-04-24 22:39:24.18138883 +0000 UTC m=+587.197073238" watchObservedRunningTime="2026-04-24 22:39:24.182883072 +0000 UTC m=+587.198567471" Apr 24 22:39:25.168423 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:25.168383 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:39:30.173112 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:30.173085 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" Apr 24 22:39:30.173729 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:30.173701 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:39:32.120402 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:32.120363 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:37.524417 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:37.524390 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:39:37.524990 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:37.524969 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:39:40.174329 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:40.174286 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:39:42.119765 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:42.119725 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 24 22:39:50.174672 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:50.174630 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" 
podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:39:52.120659 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:39:52.120586 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" Apr 24 22:40:00.174035 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:00.173996 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:40:10.173915 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:10.173875 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 24 22:40:17.039316 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.039282 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"] Apr 24 22:40:17.041829 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.039614 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container" containerID="cri-o://c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce" gracePeriod=30 Apr 24 22:40:17.041829 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.039657 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kube-rbac-proxy" containerID="cri-o://e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d" gracePeriod=30 Apr 24 22:40:17.114834 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.114795 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.17:8643/healthz\": dial tcp 10.132.0.17:8643: connect: connection refused" Apr 24 22:40:17.158770 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.158718 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"] Apr 24 22:40:17.161785 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.161766 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.164129 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.164101 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e938d-predictor-serving-cert\"" Apr 24 22:40:17.164272 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.164253 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e938d-kube-rbac-proxy-sar-config\"" Apr 24 22:40:17.190640 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.190600 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"] Apr 24 22:40:17.274115 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.274073 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.274293 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.274124 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpk7p\" (UniqueName: \"kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.274293 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.274154 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.318371 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.318287 2565 generic.go:358] "Generic (PLEG): container finished" podID="1239da72-897f-4238-97c9-d1096f1fea83" containerID="e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d" exitCode=2 Apr 24 22:40:17.318371 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.318324 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerDied","Data":"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"} Apr 24 22:40:17.374917 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.374885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.375093 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.374933 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpk7p\" (UniqueName: \"kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.375093 ip-10-0-142-202 kubenswrapper[2565]: I0424 
22:40:17.374962 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.375219 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:40:17.375092 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-e938d-predictor-serving-cert: secret "success-200-isvc-e938d-predictor-serving-cert" not found Apr 24 22:40:17.375219 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:40:17.375161 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls podName:538ad1fd-f113-4c14-aaab-910c458d0d9b nodeName:}" failed. No retries permitted until 2026-04-24 22:40:17.875145765 +0000 UTC m=+640.890830146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls") pod "success-200-isvc-e938d-predictor-858f86b7bc-ckndq" (UID: "538ad1fd-f113-4c14-aaab-910c458d0d9b") : secret "success-200-isvc-e938d-predictor-serving-cert" not found Apr 24 22:40:17.375543 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.375520 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.383241 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.383216 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpk7p\" (UniqueName: \"kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.879063 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.879009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") pod \"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:17.881390 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:17.881367 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") pod 
\"success-200-isvc-e938d-predictor-858f86b7bc-ckndq\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") " pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:18.071315 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:18.071283 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" Apr 24 22:40:18.193028 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:18.192972 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"] Apr 24 22:40:18.195676 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:40:18.195646 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538ad1fd_f113_4c14_aaab_910c458d0d9b.slice/crio-563d32d631e5a28ba7db83c84f08cfebd60eda4732cdf352e805374d28c95e0b WatchSource:0}: Error finding container 563d32d631e5a28ba7db83c84f08cfebd60eda4732cdf352e805374d28c95e0b: Status 404 returned error can't find the container with id 563d32d631e5a28ba7db83c84f08cfebd60eda4732cdf352e805374d28c95e0b Apr 24 22:40:18.325273 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:18.325232 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerStarted","Data":"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"} Apr 24 22:40:18.325273 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:18.325277 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerStarted","Data":"563d32d631e5a28ba7db83c84f08cfebd60eda4732cdf352e805374d28c95e0b"} Apr 24 22:40:19.330648 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:19.330611 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerStarted","Data":"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"}
Apr 24 22:40:19.331041 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:19.330760 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:40:19.348406 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:19.348360 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podStartSLOduration=2.348346604 podStartE2EDuration="2.348346604s" podCreationTimestamp="2026-04-24 22:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:19.347629263 +0000 UTC m=+642.363313659" watchObservedRunningTime="2026-04-24 22:40:19.348346604 +0000 UTC m=+642.364031087"
Apr 24 22:40:20.174461 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.174422 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 24 22:40:20.334527 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.334497 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:40:20.335849 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.335809 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:20.578631 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.578608 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:40:20.703467 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.703375 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5fkw\" (UniqueName: \"kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw\") pod \"1239da72-897f-4238-97c9-d1096f1fea83\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") "
Apr 24 22:40:20.703467 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.703417 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\") pod \"1239da72-897f-4238-97c9-d1096f1fea83\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") "
Apr 24 22:40:20.703467 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.703454 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") pod \"1239da72-897f-4238-97c9-d1096f1fea83\" (UID: \"1239da72-897f-4238-97c9-d1096f1fea83\") "
Apr 24 22:40:20.703945 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.703918 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-1b9dc-kube-rbac-proxy-sar-config") pod "1239da72-897f-4238-97c9-d1096f1fea83" (UID: "1239da72-897f-4238-97c9-d1096f1fea83"). InnerVolumeSpecName "success-200-isvc-1b9dc-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:40:20.705810 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.705781 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1239da72-897f-4238-97c9-d1096f1fea83" (UID: "1239da72-897f-4238-97c9-d1096f1fea83"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:40:20.705898 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.705830 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw" (OuterVolumeSpecName: "kube-api-access-g5fkw") pod "1239da72-897f-4238-97c9-d1096f1fea83" (UID: "1239da72-897f-4238-97c9-d1096f1fea83"). InnerVolumeSpecName "kube-api-access-g5fkw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:40:20.804207 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.804146 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5fkw\" (UniqueName: \"kubernetes.io/projected/1239da72-897f-4238-97c9-d1096f1fea83-kube-api-access-g5fkw\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:40:20.804207 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.804201 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1239da72-897f-4238-97c9-d1096f1fea83-success-200-isvc-1b9dc-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:40:20.804207 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:20.804214 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1239da72-897f-4238-97c9-d1096f1fea83-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:40:21.338860 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.338824 2565 generic.go:358] "Generic (PLEG): container finished" podID="1239da72-897f-4238-97c9-d1096f1fea83" containerID="c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce" exitCode=0
Apr 24 22:40:21.339370 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.338922 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"
Apr 24 22:40:21.339370 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.338912 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerDied","Data":"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"}
Apr 24 22:40:21.339370 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.339025 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b" event={"ID":"1239da72-897f-4238-97c9-d1096f1fea83","Type":"ContainerDied","Data":"0dff1ff766d6d44683ed9f64ab0e05f6c85fd4ca8061c676a7c9c735a2bbfc73"}
Apr 24 22:40:21.339370 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.339042 2565 scope.go:117] "RemoveContainer" containerID="e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"
Apr 24 22:40:21.339783 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.339751 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:21.347610 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.347589 2565 scope.go:117] "RemoveContainer" containerID="c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"
Apr 24 22:40:21.354933 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.354916 2565 scope.go:117] "RemoveContainer" containerID="e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"
Apr 24 22:40:21.355187 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:40:21.355168 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d\": container with ID starting with e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d not found: ID does not exist" containerID="e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"
Apr 24 22:40:21.355241 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.355198 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d"} err="failed to get container status \"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d\": rpc error: code = NotFound desc = could not find container \"e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d\": container with ID starting with e6a2633175f9b70fb84ae186794468ad0846c27b9661aac92340a94e180c889d not found: ID does not exist"
Apr 24 22:40:21.355241 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.355216 2565 scope.go:117] "RemoveContainer" containerID="c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"
Apr 24 22:40:21.355447 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:40:21.355430 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce\": container with ID starting with c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce not found: ID does not exist" containerID="c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"
Apr 24 22:40:21.355495 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.355457 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce"} err="failed to get container status \"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce\": rpc error: code = NotFound desc = could not find container \"c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce\": container with ID starting with c99210c26f1560669e781890b87b2ef89a14b4d32884a20774a653a1b26a73ce not found: ID does not exist"
Apr 24 22:40:21.360972 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.360950 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"]
Apr 24 22:40:21.366345 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.366314 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-1b9dc-predictor-68868569ff-tdd6b"]
Apr 24 22:40:21.575696 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:21.575655 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1239da72-897f-4238-97c9-d1096f1fea83" path="/var/lib/kubelet/pods/1239da72-897f-4238-97c9-d1096f1fea83/volumes"
Apr 24 22:40:26.344859 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:26.344831 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:40:26.345339 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:26.345312 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:30.174995 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:30.174966 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:40:36.345843 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:36.345796 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:46.345412 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:46.345374 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:56.345352 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.345312 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 24 22:40:56.888012 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.887979 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"]
Apr 24 22:40:56.888301 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.888274 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" containerID="cri-o://4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f" gracePeriod=30
Apr 24 22:40:56.888995 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.888906 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kube-rbac-proxy" containerID="cri-o://638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca" gracePeriod=30
Apr 24 22:40:56.988280 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988243 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"]
Apr 24 22:40:56.988745 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988727 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kube-rbac-proxy"
Apr 24 22:40:56.988793 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988750 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kube-rbac-proxy"
Apr 24 22:40:56.988793 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988768 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container"
Apr 24 22:40:56.988793 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988776 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container"
Apr 24 22:40:56.988892 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988848 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kserve-container"
Apr 24 22:40:56.988892 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.988860 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="1239da72-897f-4238-97c9-d1096f1fea83" containerName="kube-rbac-proxy"
Apr 24 22:40:56.992108 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.992085 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:56.994640 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.994618 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f8431-predictor-serving-cert\""
Apr 24 22:40:56.994772 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:56.994625 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f8431-kube-rbac-proxy-sar-config\""
Apr 24 22:40:57.003152 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.003130 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"]
Apr 24 22:40:57.075801 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.075764 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.075962 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.075813 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.075962 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.075884 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxvt\" (UniqueName: \"kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.176937 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.176838 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.176937 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.176892 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.177165 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.176946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxvt\" (UniqueName: \"kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.177570 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.177548 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.179360 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.179338 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.187871 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.187845 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxvt\" (UniqueName: \"kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt\") pod \"success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.305654 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.305612 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:57.422629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.422568 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"]
Apr 24 22:40:57.425834 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:40:57.425799 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9234db16_92ec_497c_9e99_7f02307e6e28.slice/crio-62ed56906c388b55159dd07c8f5cc22eec5422a52f06bff6f4c92eb67baa15df WatchSource:0}: Error finding container 62ed56906c388b55159dd07c8f5cc22eec5422a52f06bff6f4c92eb67baa15df: Status 404 returned error can't find the container with id 62ed56906c388b55159dd07c8f5cc22eec5422a52f06bff6f4c92eb67baa15df
Apr 24 22:40:57.448414 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.448387 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerStarted","Data":"62ed56906c388b55159dd07c8f5cc22eec5422a52f06bff6f4c92eb67baa15df"}
Apr 24 22:40:57.450343 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.450320 2565 generic.go:358] "Generic (PLEG): container finished" podID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerID="638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca" exitCode=2
Apr 24 22:40:57.450437 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:57.450357 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerDied","Data":"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"}
Apr 24 22:40:58.454291 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:58.454257 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerStarted","Data":"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4"}
Apr 24 22:40:58.454291 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:58.454294 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerStarted","Data":"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74"}
Apr 24 22:40:58.454844 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:58.454418 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:58.471378 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:58.471321 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podStartSLOduration=2.471306063 podStartE2EDuration="2.471306063s" podCreationTimestamp="2026-04-24 22:40:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:40:58.469765782 +0000 UTC m=+681.485450180" watchObservedRunningTime="2026-04-24 22:40:58.471306063 +0000 UTC m=+681.486990463"
Apr 24 22:40:59.457394 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:59.457351 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:40:59.458590 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:40:59.458548 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:00.169455 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:00.169415 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.18:8643/healthz\": dial tcp 10.132.0.18:8643: connect: connection refused"
Apr 24 22:41:00.173750 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:00.173727 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 24 22:41:00.459990 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:00.459889 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:04.132588 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.132556 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:41:04.232529 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232498 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bndxf\" (UniqueName: \"kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf\") pod \"79c9f152-cc03-467c-b33e-c1c8523e218d\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") "
Apr 24 22:41:04.232702 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232553 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"79c9f152-cc03-467c-b33e-c1c8523e218d\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") "
Apr 24 22:41:04.232702 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232644 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location\") pod \"79c9f152-cc03-467c-b33e-c1c8523e218d\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") "
Apr 24 22:41:04.232702 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232675 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") pod \"79c9f152-cc03-467c-b33e-c1c8523e218d\" (UID: \"79c9f152-cc03-467c-b33e-c1c8523e218d\") "
Apr 24 22:41:04.232930 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232895 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "79c9f152-cc03-467c-b33e-c1c8523e218d" (UID: "79c9f152-cc03-467c-b33e-c1c8523e218d"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:41:04.233060 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.232966 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "79c9f152-cc03-467c-b33e-c1c8523e218d" (UID: "79c9f152-cc03-467c-b33e-c1c8523e218d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:41:04.234612 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.234571 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf" (OuterVolumeSpecName: "kube-api-access-bndxf") pod "79c9f152-cc03-467c-b33e-c1c8523e218d" (UID: "79c9f152-cc03-467c-b33e-c1c8523e218d"). InnerVolumeSpecName "kube-api-access-bndxf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:41:04.234850 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.234831 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "79c9f152-cc03-467c-b33e-c1c8523e218d" (UID: "79c9f152-cc03-467c-b33e-c1c8523e218d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:41:04.334094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.334063 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/79c9f152-cc03-467c-b33e-c1c8523e218d-kserve-provision-location\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:41:04.334094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.334092 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/79c9f152-cc03-467c-b33e-c1c8523e218d-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:41:04.334094 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.334102 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bndxf\" (UniqueName: \"kubernetes.io/projected/79c9f152-cc03-467c-b33e-c1c8523e218d-kube-api-access-bndxf\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:41:04.334313 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.334112 2565 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/79c9f152-cc03-467c-b33e-c1c8523e218d-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:41:04.473520 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.473486 2565 generic.go:358] "Generic (PLEG): container finished" podID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerID="4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f" exitCode=0
Apr 24 22:41:04.473708 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.473546 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerDied","Data":"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"}
Apr 24 22:41:04.473708 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.473593 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2" event={"ID":"79c9f152-cc03-467c-b33e-c1c8523e218d","Type":"ContainerDied","Data":"3fab459caa182fd5e6ed8b00d08b2e1be218e060b71a6872bc5d2102fc6cc446"}
Apr 24 22:41:04.473708 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.473594 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"
Apr 24 22:41:04.473708 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.473615 2565 scope.go:117] "RemoveContainer" containerID="638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"
Apr 24 22:41:04.481886 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.481871 2565 scope.go:117] "RemoveContainer" containerID="4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"
Apr 24 22:41:04.488617 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.488600 2565 scope.go:117] "RemoveContainer" containerID="cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"
Apr 24 22:41:04.493873 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.493851 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"]
Apr 24 22:41:04.495709 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.495664 2565 scope.go:117] "RemoveContainer" containerID="638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"
Apr 24 22:41:04.496036 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:41:04.496010 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca\": container with ID starting with 638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca not found: ID does not exist" containerID="638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"
Apr 24 22:41:04.496148 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.496042 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca"} err="failed to get container status \"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca\": rpc error: code = NotFound desc = could not find container \"638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca\": container with ID starting with 638ae2f1b87507bde999272f7be6ed1bad68543a3a5b6b09b13abaaf66c9bcca not found: ID does not exist"
Apr 24 22:41:04.496148 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.496076 2565 scope.go:117] "RemoveContainer" containerID="4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"
Apr 24 22:41:04.496379 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:41:04.496349 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f\": container with ID starting with 4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f not found: ID does not exist" containerID="4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"
Apr 24 22:41:04.496379 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.496371 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f"} err="failed to get container status \"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f\": rpc error: code = NotFound desc = could not find container \"4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f\": container with ID starting with 4361b2b8b1b406c120590a3c605d031f2cb997e2082b4831bfe0f87f83b0a21f not found: ID does not exist"
Apr 24 22:41:04.496476 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.496385 2565 scope.go:117] "RemoveContainer" containerID="cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"
Apr 24 22:41:04.496724 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:41:04.496680 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c\": container with ID starting with cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c not found: ID does not exist" containerID="cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"
Apr 24 22:41:04.496822 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.496716 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c"} err="failed to get container status \"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c\": rpc error: code = NotFound desc = could not find container \"cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c\": container with ID starting with cb5e6aef50a1d3aa39e973c9bbc9e3ce3b233d268db08c2673f4b3bce9f2002c not found: ID does not exist"
Apr 24 22:41:04.497251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:04.497235 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-p5mg2"]
Apr 24 22:41:05.464304 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:05.464275 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"
Apr 24 22:41:05.464805 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:05.464780 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:05.577696 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:05.577662 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" path="/var/lib/kubelet/pods/79c9f152-cc03-467c-b33e-c1c8523e218d/volumes"
Apr 24 22:41:06.345728 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:06.345699 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:41:15.465310 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:15.465270 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:25.465822 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:25.465733 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:35.465787 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:41:35.465747 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 24 22:41:45.465734 ip-10-0-142-202
kubenswrapper[2565]: I0424 22:41:45.465701 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" Apr 24 22:44:37.542066 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:44:37.542040 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:44:37.544253 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:44:37.544234 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:49:31.914910 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.914868 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"] Apr 24 22:49:31.915477 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.915270 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container" containerID="cri-o://52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193" gracePeriod=30 Apr 24 22:49:31.915477 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.915351 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kube-rbac-proxy" containerID="cri-o://edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4" gracePeriod=30 Apr 24 22:49:31.994134 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994103 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"] Apr 24 22:49:31.994385 ip-10-0-142-202 kubenswrapper[2565]: 
I0424 22:49:31.994374 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="storage-initializer" Apr 24 22:49:31.994428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994388 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="storage-initializer" Apr 24 22:49:31.994428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994403 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kube-rbac-proxy" Apr 24 22:49:31.994428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994408 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kube-rbac-proxy" Apr 24 22:49:31.994428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994420 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" Apr 24 22:49:31.994428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994426 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" Apr 24 22:49:31.994572 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994466 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kserve-container" Apr 24 22:49:31.994572 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.994474 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="79c9f152-cc03-467c-b33e-c1c8523e218d" containerName="kube-rbac-proxy" Apr 24 22:49:31.997335 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.997319 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:31.999038 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.999018 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5c921-predictor-serving-cert\"" Apr 24 22:49:31.999125 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:31.999021 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-5c921-kube-rbac-proxy-sar-config\"" Apr 24 22:49:32.011818 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.011780 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"] Apr 24 22:49:32.121754 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.121723 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvcqg\" (UniqueName: \"kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.121925 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.121767 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.121925 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.121786 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.222965 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.222938 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvcqg\" (UniqueName: \"kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.223136 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.222978 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.223136 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.222997 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:49:32.223136 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:49:32.223118 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-5c921-predictor-serving-cert: secret "success-200-isvc-5c921-predictor-serving-cert" not found 
Apr 24 22:49:32.223311 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:49:32.223193 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls podName:0c54d696-b7e8-4f6a-9d10-5868f5f05732 nodeName:}" failed. No retries permitted until 2026-04-24 22:49:32.723170958 +0000 UTC m=+1195.738855348 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls") pod "success-200-isvc-5c921-predictor-8548d846d5-287kf" (UID: "0c54d696-b7e8-4f6a-9d10-5868f5f05732") : secret "success-200-isvc-5c921-predictor-serving-cert" not found
Apr 24 22:49:32.223600 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.223559 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:32.230710 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.230693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvcqg\" (UniqueName: \"kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:32.726989 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.726959 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:32.729374 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.729349 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") pod \"success-200-isvc-5c921-predictor-8548d846d5-287kf\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:32.838217 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.838180 2565 generic.go:358] "Generic (PLEG): container finished" podID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerID="edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4" exitCode=2
Apr 24 22:49:32.838376 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.838254 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerDied","Data":"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"}
Apr 24 22:49:32.907565 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:32.907531 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:33.024187 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.024156 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"]
Apr 24 22:49:33.027360 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:49:33.027330 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c54d696_b7e8_4f6a_9d10_5868f5f05732.slice/crio-c4ff18005c287178856d3ce1672395bfd298a1bb6f2f0f7343093e78f4a7e842 WatchSource:0}: Error finding container c4ff18005c287178856d3ce1672395bfd298a1bb6f2f0f7343093e78f4a7e842: Status 404 returned error can't find the container with id c4ff18005c287178856d3ce1672395bfd298a1bb6f2f0f7343093e78f4a7e842
Apr 24 22:49:33.029023 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.029007 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:49:33.843428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.843386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerStarted","Data":"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02"}
Apr 24 22:49:33.843428 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.843433 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerStarted","Data":"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e"}
Apr 24 22:49:33.843637 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.843447 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerStarted","Data":"c4ff18005c287178856d3ce1672395bfd298a1bb6f2f0f7343093e78f4a7e842"}
Apr 24 22:49:33.843712 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.843677 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:33.843827 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.843809 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:33.845226 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.845193 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:49:33.859804 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:33.859763 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podStartSLOduration=2.8597510120000003 podStartE2EDuration="2.859751012s" podCreationTimestamp="2026-04-24 22:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:49:33.859193168 +0000 UTC m=+1196.874877604" watchObservedRunningTime="2026-04-24 22:49:33.859751012 +0000 UTC m=+1196.875435412"
Apr 24 22:49:34.846128 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:34.846093 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:49:35.061251 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.061229 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:49:35.141189 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.141110 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpk7p\" (UniqueName: \"kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p\") pod \"538ad1fd-f113-4c14-aaab-910c458d0d9b\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") "
Apr 24 22:49:35.141189 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.141186 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config\") pod \"538ad1fd-f113-4c14-aaab-910c458d0d9b\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") "
Apr 24 22:49:35.141408 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.141219 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") pod \"538ad1fd-f113-4c14-aaab-910c458d0d9b\" (UID: \"538ad1fd-f113-4c14-aaab-910c458d0d9b\") "
Apr 24 22:49:35.141552 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.141526 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e938d-kube-rbac-proxy-sar-config") pod "538ad1fd-f113-4c14-aaab-910c458d0d9b" (UID: "538ad1fd-f113-4c14-aaab-910c458d0d9b"). InnerVolumeSpecName "success-200-isvc-e938d-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 22:49:35.143213 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.143186 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p" (OuterVolumeSpecName: "kube-api-access-hpk7p") pod "538ad1fd-f113-4c14-aaab-910c458d0d9b" (UID: "538ad1fd-f113-4c14-aaab-910c458d0d9b"). InnerVolumeSpecName "kube-api-access-hpk7p". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:49:35.143213 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.143195 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "538ad1fd-f113-4c14-aaab-910c458d0d9b" (UID: "538ad1fd-f113-4c14-aaab-910c458d0d9b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:49:35.242591 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.242551 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e938d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/538ad1fd-f113-4c14-aaab-910c458d0d9b-success-200-isvc-e938d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.242748 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.242600 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/538ad1fd-f113-4c14-aaab-910c458d0d9b-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.242748 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.242612 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpk7p\" (UniqueName: \"kubernetes.io/projected/538ad1fd-f113-4c14-aaab-910c458d0d9b-kube-api-access-hpk7p\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 22:49:35.849434 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.849348 2565 generic.go:358] "Generic (PLEG): container finished" podID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerID="52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193" exitCode=0
Apr 24 22:49:35.849434 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.849420 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"
Apr 24 22:49:35.849921 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.849430 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerDied","Data":"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"}
Apr 24 22:49:35.849921 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.849467 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq" event={"ID":"538ad1fd-f113-4c14-aaab-910c458d0d9b","Type":"ContainerDied","Data":"563d32d631e5a28ba7db83c84f08cfebd60eda4732cdf352e805374d28c95e0b"}
Apr 24 22:49:35.849921 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.849483 2565 scope.go:117] "RemoveContainer" containerID="edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"
Apr 24 22:49:35.857210 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.857193 2565 scope.go:117] "RemoveContainer" containerID="52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"
Apr 24 22:49:35.863672 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.863654 2565 scope.go:117] "RemoveContainer" containerID="edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"
Apr 24 22:49:35.863933 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:49:35.863911 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4\": container with ID starting with edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4 not found: ID does not exist" containerID="edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"
Apr 24 22:49:35.864026 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.863944 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4"} err="failed to get container status \"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4\": rpc error: code = NotFound desc = could not find container \"edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4\": container with ID starting with edc068bf138151a93b1f2e619965760442a847f53181a32f429d70ef5c418eb4 not found: ID does not exist"
Apr 24 22:49:35.864026 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.863967 2565 scope.go:117] "RemoveContainer" containerID="52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"
Apr 24 22:49:35.864503 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:49:35.864255 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193\": container with ID starting with 52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193 not found: ID does not exist" containerID="52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"
Apr 24 22:49:35.864503 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.864290 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193"} err="failed to get container status \"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193\": rpc error: code = NotFound desc = could not find container \"52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193\": container with ID starting with 52cbe0a601bf6510e3c40ae7d4d3b400065492ce190e963a8d091424beca8193 not found: ID does not exist"
Apr 24 22:49:35.865491 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.865474 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"]
Apr 24 22:49:35.870705 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:35.870685 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e938d-predictor-858f86b7bc-ckndq"]
Apr 24 22:49:37.559653 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:37.559627 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log"
Apr 24 22:49:37.562829 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:37.562809 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log"
Apr 24 22:49:37.574553 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:37.574531 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" path="/var/lib/kubelet/pods/538ad1fd-f113-4c14-aaab-910c458d0d9b/volumes"
Apr 24 22:49:39.850235 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:39.850208 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"
Apr 24 22:49:39.850666 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:39.850640 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:49:49.851042 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:49.851003 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:49:59.850732 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:49:59.850693 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:50:09.851130 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:09.851092 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 24 22:50:11.923438 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.923403 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"]
Apr 24 22:50:11.923843 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.923774 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" containerID="cri-o://a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74" gracePeriod=30
Apr 24 22:50:11.923843 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.923800 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kube-rbac-proxy" containerID="cri-o://fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4" gracePeriod=30
Apr 24 22:50:11.952683 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.952647 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"]
Apr 24 22:50:11.953262 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953241 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kube-rbac-proxy"
Apr 24 22:50:11.953262 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953263 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kube-rbac-proxy"
Apr 24 22:50:11.953398 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953296 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container"
Apr 24 22:50:11.953398 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953306 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container"
Apr 24 22:50:11.953495 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953436 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kube-rbac-proxy"
Apr 24 22:50:11.953495 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.953449 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="538ad1fd-f113-4c14-aaab-910c458d0d9b" containerName="kserve-container"
Apr 24 22:50:11.957141 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.957121 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"
Apr 24 22:50:11.958968 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.958942 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-32c7c-predictor-serving-cert\""
Apr 24 22:50:11.959101 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.959085 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\""
Apr 24 22:50:11.964033 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:11.964010 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"]
Apr 24 22:50:12.115721 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.115689 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxzzg\" (UniqueName: \"kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"
Apr 24 22:50:12.115882 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.115733 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"
Apr 24 22:50:12.115882 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.115787 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.217036 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.216940 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxzzg\" (UniqueName: \"kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.217036 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.216995 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.217263 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.217046 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.217263 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:12.217185 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-serving-cert: secret "success-200-isvc-32c7c-predictor-serving-cert" not found Apr 
24 22:50:12.217362 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:12.217268 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls podName:4ed68bcb-5c74-4d05-a91c-593e9270ab6d nodeName:}" failed. No retries permitted until 2026-04-24 22:50:12.717247242 +0000 UTC m=+1235.732931623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls") pod "success-200-isvc-32c7c-predictor-57df59874f-7vtc8" (UID: "4ed68bcb-5c74-4d05-a91c-593e9270ab6d") : secret "success-200-isvc-32c7c-predictor-serving-cert" not found Apr 24 22:50:12.217702 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.217681 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.225227 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.225197 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxzzg\" (UniqueName: \"kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.721495 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.721460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") pod 
\"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.723821 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.723799 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") pod \"success-200-isvc-32c7c-predictor-57df59874f-7vtc8\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.867416 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.867386 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:12.957136 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.957082 2565 generic.go:358] "Generic (PLEG): container finished" podID="9234db16-92ec-497c-9e99-7f02307e6e28" containerID="fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4" exitCode=2 Apr 24 22:50:12.957546 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.957211 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerDied","Data":"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4"} Apr 24 22:50:12.986703 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:12.986630 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"] Apr 24 22:50:12.989457 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:50:12.989423 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed68bcb_5c74_4d05_a91c_593e9270ab6d.slice/crio-7b67cfe1ff14a50039b310e6f510874e4541653d5d0ebee7e8f7b23b5e371c51 WatchSource:0}: Error finding container 7b67cfe1ff14a50039b310e6f510874e4541653d5d0ebee7e8f7b23b5e371c51: Status 404 returned error can't find the container with id 7b67cfe1ff14a50039b310e6f510874e4541653d5d0ebee7e8f7b23b5e371c51 Apr 24 22:50:13.962457 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:13.962421 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerStarted","Data":"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234"} Apr 24 22:50:13.962457 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:13.962458 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerStarted","Data":"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0"} Apr 24 22:50:13.962457 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:13.962467 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerStarted","Data":"7b67cfe1ff14a50039b310e6f510874e4541653d5d0ebee7e8f7b23b5e371c51"} Apr 24 22:50:13.962963 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:13.962541 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:13.978752 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:13.978715 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podStartSLOduration=2.978702232 
podStartE2EDuration="2.978702232s" podCreationTimestamp="2026-04-24 22:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:13.977931095 +0000 UTC m=+1236.993615505" watchObservedRunningTime="2026-04-24 22:50:13.978702232 +0000 UTC m=+1236.994386632" Apr 24 22:50:14.965844 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:14.965815 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:14.966922 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:14.966894 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:50:15.272596 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.272556 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" Apr 24 22:50:15.443165 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.443123 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls\") pod \"9234db16-92ec-497c-9e99-7f02307e6e28\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " Apr 24 22:50:15.443362 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.443202 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxvt\" (UniqueName: \"kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt\") pod \"9234db16-92ec-497c-9e99-7f02307e6e28\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " Apr 24 22:50:15.443362 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.443220 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config\") pod \"9234db16-92ec-497c-9e99-7f02307e6e28\" (UID: \"9234db16-92ec-497c-9e99-7f02307e6e28\") " Apr 24 22:50:15.443682 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.443649 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f8431-kube-rbac-proxy-sar-config") pod "9234db16-92ec-497c-9e99-7f02307e6e28" (UID: "9234db16-92ec-497c-9e99-7f02307e6e28"). InnerVolumeSpecName "success-200-isvc-f8431-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:50:15.445248 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.445212 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9234db16-92ec-497c-9e99-7f02307e6e28" (UID: "9234db16-92ec-497c-9e99-7f02307e6e28"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:50:15.445367 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.445313 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt" (OuterVolumeSpecName: "kube-api-access-bwxvt") pod "9234db16-92ec-497c-9e99-7f02307e6e28" (UID: "9234db16-92ec-497c-9e99-7f02307e6e28"). InnerVolumeSpecName "kube-api-access-bwxvt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:50:15.544287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.544192 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9234db16-92ec-497c-9e99-7f02307e6e28-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:50:15.544287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.544234 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bwxvt\" (UniqueName: \"kubernetes.io/projected/9234db16-92ec-497c-9e99-7f02307e6e28-kube-api-access-bwxvt\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:50:15.544287 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.544255 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f8431-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9234db16-92ec-497c-9e99-7f02307e6e28-success-200-isvc-f8431-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 
22:50:15.969691 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.969660 2565 generic.go:358] "Generic (PLEG): container finished" podID="9234db16-92ec-497c-9e99-7f02307e6e28" containerID="a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74" exitCode=0 Apr 24 22:50:15.970078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.969730 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" Apr 24 22:50:15.970078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.969742 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerDied","Data":"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74"} Apr 24 22:50:15.970078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.969778 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz" event={"ID":"9234db16-92ec-497c-9e99-7f02307e6e28","Type":"ContainerDied","Data":"62ed56906c388b55159dd07c8f5cc22eec5422a52f06bff6f4c92eb67baa15df"} Apr 24 22:50:15.970078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.969794 2565 scope.go:117] "RemoveContainer" containerID="fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4" Apr 24 22:50:15.970282 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.970187 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:50:15.977232 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.977214 2565 scope.go:117] "RemoveContainer" 
containerID="a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74" Apr 24 22:50:15.984078 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.984059 2565 scope.go:117] "RemoveContainer" containerID="fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4" Apr 24 22:50:15.984323 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:15.984305 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4\": container with ID starting with fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4 not found: ID does not exist" containerID="fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4" Apr 24 22:50:15.984359 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.984332 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4"} err="failed to get container status \"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4\": rpc error: code = NotFound desc = could not find container \"fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4\": container with ID starting with fc8be547a85e40fdeb881bcfeeb8f396551310a951fb036c696c27d514cf02b4 not found: ID does not exist" Apr 24 22:50:15.984359 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.984354 2565 scope.go:117] "RemoveContainer" containerID="a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74" Apr 24 22:50:15.984564 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:15.984539 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74\": container with ID starting with a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74 not found: ID does not exist" 
containerID="a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74" Apr 24 22:50:15.984665 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:15.984570 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74"} err="failed to get container status \"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74\": rpc error: code = NotFound desc = could not find container \"a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74\": container with ID starting with a21e2fa4a1c5364a1e5e9dfd9188a23ff7a218101b99889c7cd6769bf6ad6a74 not found: ID does not exist" Apr 24 22:50:16.008036 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:16.008013 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"] Apr 24 22:50:16.011780 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:16.011759 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f8431-predictor-6bcffc9d57-nq2nz"] Apr 24 22:50:17.575341 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:17.575310 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" path="/var/lib/kubelet/pods/9234db16-92ec-497c-9e99-7f02307e6e28/volumes" Apr 24 22:50:19.851744 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:19.851716 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:50:20.974857 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:20.974831 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:50:20.975448 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:20.975415 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:50:30.975926 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:30.975840 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:50:40.976131 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:40.976083 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:50:42.228759 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.228722 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"] Apr 24 22:50:42.229233 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.228991 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" containerID="cri-o://05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e" gracePeriod=30 Apr 24 22:50:42.229233 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.229045 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kube-rbac-proxy" 
containerID="cri-o://c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02" gracePeriod=30 Apr 24 22:50:42.253758 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.253730 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 22:50:42.254021 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254007 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" Apr 24 22:50:42.254021 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254021 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" Apr 24 22:50:42.254100 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254032 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kube-rbac-proxy" Apr 24 22:50:42.254100 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254037 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kube-rbac-proxy" Apr 24 22:50:42.254100 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254087 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kube-rbac-proxy" Apr 24 22:50:42.254100 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.254098 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="9234db16-92ec-497c-9e99-7f02307e6e28" containerName="kserve-container" Apr 24 22:50:42.257764 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.257749 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.259629 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.259612 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4aba3-predictor-serving-cert\"" Apr 24 22:50:42.260009 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.259994 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\"" Apr 24 22:50:42.265372 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.265354 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 22:50:42.335968 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.335934 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.336112 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.335985 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8hjz\" (UniqueName: \"kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.336112 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.336077 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.436651 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.436613 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8hjz\" (UniqueName: \"kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.436824 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.436673 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.436824 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.436715 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.437458 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.437411 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.439454 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.439423 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.445215 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.445190 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8hjz\" (UniqueName: \"kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz\") pod \"success-200-isvc-4aba3-predictor-6dcd794499-dzrk9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.567906 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.567815 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:42.686068 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:42.686044 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 22:50:42.688432 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:50:42.688396 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47c13b48_ba40_4eb0_b3c4_1e09c63114a9.slice/crio-6344b2599cf82e2a31d6296d00059b4d3fbce856f29bae4f9f03c97e3a728bea WatchSource:0}: Error finding container 6344b2599cf82e2a31d6296d00059b4d3fbce856f29bae4f9f03c97e3a728bea: Status 404 returned error can't find the container with id 6344b2599cf82e2a31d6296d00059b4d3fbce856f29bae4f9f03c97e3a728bea Apr 24 22:50:43.044874 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.044843 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerID="c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02" exitCode=2 Apr 24 22:50:43.045049 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.044917 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerDied","Data":"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02"} Apr 24 22:50:43.046640 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.046615 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerStarted","Data":"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f"} Apr 24 22:50:43.046739 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.046648 2565 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerStarted","Data":"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24"} Apr 24 22:50:43.046739 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.046662 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerStarted","Data":"6344b2599cf82e2a31d6296d00059b4d3fbce856f29bae4f9f03c97e3a728bea"} Apr 24 22:50:43.046835 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.046802 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:43.070231 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:43.070181 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podStartSLOduration=1.070166402 podStartE2EDuration="1.070166402s" podCreationTimestamp="2026-04-24 22:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:50:43.062545086 +0000 UTC m=+1266.078229485" watchObservedRunningTime="2026-04-24 22:50:43.070166402 +0000 UTC m=+1266.085850802" Apr 24 22:50:44.050593 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:44.050543 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:44.051907 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:44.051879 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:50:44.847105 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:44.847059 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 22:50:45.053332 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.053298 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:50:45.267109 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.267088 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:50:45.360383 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.360351 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvcqg\" (UniqueName: \"kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg\") pod \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " Apr 24 22:50:45.360523 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.360396 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config\") pod \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " Apr 24 22:50:45.360523 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.360426 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") pod \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\" (UID: \"0c54d696-b7e8-4f6a-9d10-5868f5f05732\") " Apr 24 22:50:45.360720 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.360691 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-5c921-kube-rbac-proxy-sar-config") pod "0c54d696-b7e8-4f6a-9d10-5868f5f05732" (UID: "0c54d696-b7e8-4f6a-9d10-5868f5f05732"). InnerVolumeSpecName "success-200-isvc-5c921-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:50:45.362430 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.362408 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0c54d696-b7e8-4f6a-9d10-5868f5f05732" (UID: "0c54d696-b7e8-4f6a-9d10-5868f5f05732"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:50:45.362491 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.362427 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg" (OuterVolumeSpecName: "kube-api-access-tvcqg") pod "0c54d696-b7e8-4f6a-9d10-5868f5f05732" (UID: "0c54d696-b7e8-4f6a-9d10-5868f5f05732"). InnerVolumeSpecName "kube-api-access-tvcqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:50:45.460942 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.460871 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvcqg\" (UniqueName: \"kubernetes.io/projected/0c54d696-b7e8-4f6a-9d10-5868f5f05732-kube-api-access-tvcqg\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:50:45.460942 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.460896 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-5c921-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0c54d696-b7e8-4f6a-9d10-5868f5f05732-success-200-isvc-5c921-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:50:45.460942 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:45.460907 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0c54d696-b7e8-4f6a-9d10-5868f5f05732-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 
22:50:46.057080 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.057049 2565 generic.go:358] "Generic (PLEG): container finished" podID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerID="05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e" exitCode=0 Apr 24 22:50:46.057533 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.057101 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerDied","Data":"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e"} Apr 24 22:50:46.057533 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.057124 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" event={"ID":"0c54d696-b7e8-4f6a-9d10-5868f5f05732","Type":"ContainerDied","Data":"c4ff18005c287178856d3ce1672395bfd298a1bb6f2f0f7343093e78f4a7e842"} Apr 24 22:50:46.057533 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.057131 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf" Apr 24 22:50:46.057533 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.057137 2565 scope.go:117] "RemoveContainer" containerID="c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02" Apr 24 22:50:46.064357 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.064341 2565 scope.go:117] "RemoveContainer" containerID="05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e" Apr 24 22:50:46.070695 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.070677 2565 scope.go:117] "RemoveContainer" containerID="c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02" Apr 24 22:50:46.070924 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:46.070903 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02\": container with ID starting with c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02 not found: ID does not exist" containerID="c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02" Apr 24 22:50:46.070970 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.070933 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02"} err="failed to get container status \"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02\": rpc error: code = NotFound desc = could not find container \"c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02\": container with ID starting with c1b25818034bd43ee2ffd81b252928034145ac31b47366466152530501013d02 not found: ID does not exist" Apr 24 22:50:46.070970 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.070949 2565 scope.go:117] "RemoveContainer" containerID="05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e" Apr 24 
22:50:46.071160 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:50:46.071146 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e\": container with ID starting with 05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e not found: ID does not exist" containerID="05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e" Apr 24 22:50:46.071202 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.071164 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e"} err="failed to get container status \"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e\": rpc error: code = NotFound desc = could not find container \"05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e\": container with ID starting with 05f95cb218ca30ae4c09cc13c073d6d001c28959b97ab5ec30ae999b132aff1e not found: ID does not exist" Apr 24 22:50:46.079568 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.079549 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"] Apr 24 22:50:46.084024 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:46.084005 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-5c921-predictor-8548d846d5-287kf"] Apr 24 22:50:47.575863 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:47.575833 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" path="/var/lib/kubelet/pods/0c54d696-b7e8-4f6a-9d10-5868f5f05732/volumes" Apr 24 22:50:50.058056 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:50.058029 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:50:50.058531 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:50.058507 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:50:50.975685 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:50:50.975651 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 22:51:00.058630 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:00.058592 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:51:00.976427 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:00.976404 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:51:10.059278 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:10.059238 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:51:20.059273 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:20.059232 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 22:51:22.120123 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.120088 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"] Apr 24 22:51:22.121028 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.120443 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" containerID="cri-o://7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0" gracePeriod=30 Apr 24 22:51:22.121028 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.120614 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kube-rbac-proxy" containerID="cri-o://e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234" gracePeriod=30 Apr 24 22:51:22.182167 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182133 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"] Apr 24 22:51:22.182562 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182539 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" Apr 24 22:51:22.182562 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182558 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" Apr 24 22:51:22.182668 ip-10-0-142-202 kubenswrapper[2565]: 
I0424 22:51:22.182608 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kube-rbac-proxy" Apr 24 22:51:22.182668 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182617 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kube-rbac-proxy" Apr 24 22:51:22.182741 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182683 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kserve-container" Apr 24 22:51:22.182741 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.182700 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c54d696-b7e8-4f6a-9d10-5868f5f05732" containerName="kube-rbac-proxy" Apr 24 22:51:22.187460 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.187434 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.189270 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.189248 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2e6fe-predictor-serving-cert\"" Apr 24 22:51:22.189493 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.189477 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\"" Apr 24 22:51:22.194299 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.194279 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"] Apr 24 22:51:22.318535 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.318496 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdm78\" (UniqueName: 
\"kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.318711 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.318588 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.318711 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.318640 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.419252 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.419175 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdm78\" (UniqueName: \"kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.419252 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.419235 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.419457 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.419258 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.419875 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.419850 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.421706 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.421680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.426130 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.426111 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdm78\" (UniqueName: 
\"kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78\") pod \"success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") " pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.499034 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.498997 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:22.617662 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:22.617631 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"] Apr 24 22:51:22.620324 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:51:22.620297 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b0231fe_2a96_4d74_a5f8_9528010359db.slice/crio-08e13e62829f653bda6db128b1e52eacf1826577ae0a916b7ff080cbaf04ad57 WatchSource:0}: Error finding container 08e13e62829f653bda6db128b1e52eacf1826577ae0a916b7ff080cbaf04ad57: Status 404 returned error can't find the container with id 08e13e62829f653bda6db128b1e52eacf1826577ae0a916b7ff080cbaf04ad57 Apr 24 22:51:23.165420 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.165387 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerID="e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234" exitCode=2 Apr 24 22:51:23.165834 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.165457 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerDied","Data":"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234"} Apr 24 22:51:23.166834 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.166812 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerStarted","Data":"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"} Apr 24 22:51:23.166938 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.166840 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerStarted","Data":"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"} Apr 24 22:51:23.166938 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.166851 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerStarted","Data":"08e13e62829f653bda6db128b1e52eacf1826577ae0a916b7ff080cbaf04ad57"} Apr 24 22:51:23.167027 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.166969 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:23.183649 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:23.183609 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podStartSLOduration=1.183593831 podStartE2EDuration="1.183593831s" podCreationTimestamp="2026-04-24 22:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:51:23.182216327 +0000 UTC m=+1306.197900727" watchObservedRunningTime="2026-04-24 22:51:23.183593831 +0000 UTC m=+1306.199278222" Apr 24 22:51:24.169228 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:24.169203 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:24.170455 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:24.170425 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 22:51:25.057729 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.057709 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:51:25.139609 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.139514 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxzzg\" (UniqueName: \"kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg\") pod \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " Apr 24 22:51:25.139609 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.139587 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") pod \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " Apr 24 22:51:25.139827 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.139626 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config\") pod \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\" (UID: \"4ed68bcb-5c74-4d05-a91c-593e9270ab6d\") " Apr 24 22:51:25.147612 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.142794 2565 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ed68bcb-5c74-4d05-a91c-593e9270ab6d" (UID: "4ed68bcb-5c74-4d05-a91c-593e9270ab6d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:51:25.147919 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.147740 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg" (OuterVolumeSpecName: "kube-api-access-vxzzg") pod "4ed68bcb-5c74-4d05-a91c-593e9270ab6d" (UID: "4ed68bcb-5c74-4d05-a91c-593e9270ab6d"). InnerVolumeSpecName "kube-api-access-vxzzg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:51:25.149048 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.149017 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-32c7c-kube-rbac-proxy-sar-config") pod "4ed68bcb-5c74-4d05-a91c-593e9270ab6d" (UID: "4ed68bcb-5c74-4d05-a91c-593e9270ab6d"). InnerVolumeSpecName "success-200-isvc-32c7c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:51:25.173661 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.173628 2565 generic.go:358] "Generic (PLEG): container finished" podID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerID="7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0" exitCode=0 Apr 24 22:51:25.174056 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.173718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerDied","Data":"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0"} Apr 24 22:51:25.174056 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.173737 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" Apr 24 22:51:25.174056 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.173764 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8" event={"ID":"4ed68bcb-5c74-4d05-a91c-593e9270ab6d","Type":"ContainerDied","Data":"7b67cfe1ff14a50039b310e6f510874e4541653d5d0ebee7e8f7b23b5e371c51"} Apr 24 22:51:25.174056 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.173788 2565 scope.go:117] "RemoveContainer" containerID="e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234" Apr 24 22:51:25.174256 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.174232 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 22:51:25.181179 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.181160 2565 scope.go:117] "RemoveContainer" 
containerID="7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0" Apr 24 22:51:25.187657 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.187641 2565 scope.go:117] "RemoveContainer" containerID="e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234" Apr 24 22:51:25.187871 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:51:25.187852 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234\": container with ID starting with e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234 not found: ID does not exist" containerID="e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234" Apr 24 22:51:25.187912 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.187879 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234"} err="failed to get container status \"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234\": rpc error: code = NotFound desc = could not find container \"e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234\": container with ID starting with e535e4960f0fa9b4bf9b86b9087515c7e0125bb722722b58905ad4b01beac234 not found: ID does not exist" Apr 24 22:51:25.187912 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.187896 2565 scope.go:117] "RemoveContainer" containerID="7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0" Apr 24 22:51:25.188112 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:51:25.188097 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0\": container with ID starting with 7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0 not found: ID does not exist" 
containerID="7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0" Apr 24 22:51:25.188147 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.188117 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0"} err="failed to get container status \"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0\": rpc error: code = NotFound desc = could not find container \"7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0\": container with ID starting with 7b3347c259e278837cbd8940ac395041cc92d9d9022034d277a8816f7c0c4cf0 not found: ID does not exist" Apr 24 22:51:25.192522 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.192500 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"] Apr 24 22:51:25.194361 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.194343 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-32c7c-predictor-57df59874f-7vtc8"] Apr 24 22:51:25.240993 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.240958 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxzzg\" (UniqueName: \"kubernetes.io/projected/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-kube-api-access-vxzzg\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.240993 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.240993 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.241169 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.241011 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-32c7c-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4ed68bcb-5c74-4d05-a91c-593e9270ab6d-success-200-isvc-32c7c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 22:51:25.576057 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:25.576025 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" path="/var/lib/kubelet/pods/4ed68bcb-5c74-4d05-a91c-593e9270ab6d/volumes" Apr 24 22:51:30.059723 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:30.059694 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 22:51:30.178844 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:30.178817 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:51:30.179353 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:30.179332 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 22:51:40.180051 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:40.180012 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 22:51:50.179651 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:51:50.179606 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.24:8080: connect: connection refused" Apr 24 22:52:00.180364 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:52:00.180280 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 22:52:10.179717 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:52:10.179686 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" Apr 24 22:54:37.578172 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:54:37.578145 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:54:37.581659 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:54:37.581639 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:59:37.597532 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:37.597422 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:59:37.609590 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:37.609558 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 22:59:57.189176 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.189142 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 22:59:57.189656 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.189501 2565 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" containerID="cri-o://c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24" gracePeriod=30 Apr 24 22:59:57.189656 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.189598 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kube-rbac-proxy" containerID="cri-o://fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f" gracePeriod=30 Apr 24 22:59:57.279862 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.279825 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"] Apr 24 22:59:57.280261 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280244 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" Apr 24 22:59:57.280310 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280266 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kserve-container" Apr 24 22:59:57.280310 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280289 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kube-rbac-proxy" Apr 24 22:59:57.280310 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280298 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kube-rbac-proxy" Apr 24 22:59:57.280437 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280375 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" 
containerName="kserve-container" Apr 24 22:59:57.280437 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.280387 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ed68bcb-5c74-4d05-a91c-593e9270ab6d" containerName="kube-rbac-proxy" Apr 24 22:59:57.283611 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.283566 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.285322 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.285302 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-44b1e-predictor-serving-cert\"" Apr 24 22:59:57.285425 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.285308 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\"" Apr 24 22:59:57.299364 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.299340 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"] Apr 24 22:59:57.378770 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.378737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.378945 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.378783 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5xc2\" (UniqueName: \"kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2\") pod 
\"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.378945 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.378878 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.479463 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.479436 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.479659 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.479477 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5xc2\" (UniqueName: \"kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.479659 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.479522 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config\") pod 
\"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.479659 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:59:57.479612 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-serving-cert: secret "success-200-isvc-44b1e-predictor-serving-cert" not found Apr 24 22:59:57.479791 ip-10-0-142-202 kubenswrapper[2565]: E0424 22:59:57.479686 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls podName:ab2f9ce5-6e26-405a-809c-d2ab579d7913 nodeName:}" failed. No retries permitted until 2026-04-24 22:59:57.979666516 +0000 UTC m=+1820.995350892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls") pod "success-200-isvc-44b1e-predictor-589485df45-9jvsp" (UID: "ab2f9ce5-6e26-405a-809c-d2ab579d7913") : secret "success-200-isvc-44b1e-predictor-serving-cert" not found Apr 24 22:59:57.480118 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.480098 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.487196 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.487172 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5xc2\" (UniqueName: \"kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: 
\"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.539156 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.539124 2565 generic.go:358] "Generic (PLEG): container finished" podID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerID="fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f" exitCode=2 Apr 24 22:59:57.539316 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.539197 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerDied","Data":"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f"} Apr 24 22:59:57.983389 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.983358 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:57.985774 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:57.985755 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") pod \"success-200-isvc-44b1e-predictor-589485df45-9jvsp\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:58.195636 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.195603 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:58.315607 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.315501 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"] Apr 24 22:59:58.318205 ip-10-0-142-202 kubenswrapper[2565]: W0424 22:59:58.318177 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2f9ce5_6e26_405a_809c_d2ab579d7913.slice/crio-b7e1bd9038ab89f6027df794aeb4fd8398efb278b0f800f51cfd9677a109d014 WatchSource:0}: Error finding container b7e1bd9038ab89f6027df794aeb4fd8398efb278b0f800f51cfd9677a109d014: Status 404 returned error can't find the container with id b7e1bd9038ab89f6027df794aeb4fd8398efb278b0f800f51cfd9677a109d014 Apr 24 22:59:58.320111 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.320094 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:59:58.545349 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.545258 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerStarted","Data":"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"} Apr 24 22:59:58.545349 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.545303 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerStarted","Data":"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4"} Apr 24 22:59:58.545349 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.545317 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" 
event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerStarted","Data":"b7e1bd9038ab89f6027df794aeb4fd8398efb278b0f800f51cfd9677a109d014"} Apr 24 22:59:58.545660 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.545522 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:58.545718 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.545705 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 22:59:58.546684 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.546649 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 22:59:58.562135 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:58.562094 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podStartSLOduration=1.562082585 podStartE2EDuration="1.562082585s" podCreationTimestamp="2026-04-24 22:59:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:59:58.560405299 +0000 UTC m=+1821.576089696" watchObservedRunningTime="2026-04-24 22:59:58.562082585 +0000 UTC m=+1821.577766983" Apr 24 22:59:59.549242 ip-10-0-142-202 kubenswrapper[2565]: I0424 22:59:59.549201 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" 
Apr 24 23:00:00.054227 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.054187 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.23:8643/healthz\": dial tcp 10.132.0.23:8643: connect: connection refused" Apr 24 23:00:00.058956 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.058934 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 23:00:00.340387 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.340365 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 23:00:00.402623 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.402592 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls\") pod \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " Apr 24 23:00:00.402787 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.402634 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config\") pod \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " Apr 24 23:00:00.402787 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.402655 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-r8hjz\" (UniqueName: \"kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz\") pod \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\" (UID: \"47c13b48-ba40-4eb0-b3c4-1e09c63114a9\") " Apr 24 23:00:00.403059 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.403033 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-4aba3-kube-rbac-proxy-sar-config") pod "47c13b48-ba40-4eb0-b3c4-1e09c63114a9" (UID: "47c13b48-ba40-4eb0-b3c4-1e09c63114a9"). InnerVolumeSpecName "success-200-isvc-4aba3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:00:00.404794 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.404771 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz" (OuterVolumeSpecName: "kube-api-access-r8hjz") pod "47c13b48-ba40-4eb0-b3c4-1e09c63114a9" (UID: "47c13b48-ba40-4eb0-b3c4-1e09c63114a9"). InnerVolumeSpecName "kube-api-access-r8hjz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:00:00.404876 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.404774 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "47c13b48-ba40-4eb0-b3c4-1e09c63114a9" (UID: "47c13b48-ba40-4eb0-b3c4-1e09c63114a9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:00:00.503711 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.503670 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.503711 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.503707 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-4aba3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-success-200-isvc-4aba3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.503711 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.503720 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r8hjz\" (UniqueName: \"kubernetes.io/projected/47c13b48-ba40-4eb0-b3c4-1e09c63114a9-kube-api-access-r8hjz\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:00:00.553679 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.553649 2565 generic.go:358] "Generic (PLEG): container finished" podID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerID="c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24" exitCode=0 Apr 24 23:00:00.554062 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.553691 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerDied","Data":"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24"} Apr 24 23:00:00.554062 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.553725 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" 
event={"ID":"47c13b48-ba40-4eb0-b3c4-1e09c63114a9","Type":"ContainerDied","Data":"6344b2599cf82e2a31d6296d00059b4d3fbce856f29bae4f9f03c97e3a728bea"} Apr 24 23:00:00.554062 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.553724 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9" Apr 24 23:00:00.554062 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.553737 2565 scope.go:117] "RemoveContainer" containerID="fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f" Apr 24 23:00:00.561915 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.561896 2565 scope.go:117] "RemoveContainer" containerID="c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24" Apr 24 23:00:00.568638 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.568622 2565 scope.go:117] "RemoveContainer" containerID="fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f" Apr 24 23:00:00.568868 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:00:00.568850 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f\": container with ID starting with fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f not found: ID does not exist" containerID="fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f" Apr 24 23:00:00.568926 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.568876 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f"} err="failed to get container status \"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f\": rpc error: code = NotFound desc = could not find container \"fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f\": container with ID starting with 
fa809bcb6cd36f908ea12fe9b972b5d2298705da366e93ae4b3d99c566eb420f not found: ID does not exist" Apr 24 23:00:00.568926 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.568893 2565 scope.go:117] "RemoveContainer" containerID="c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24" Apr 24 23:00:00.569072 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:00:00.569058 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24\": container with ID starting with c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24 not found: ID does not exist" containerID="c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24" Apr 24 23:00:00.569107 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.569075 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24"} err="failed to get container status \"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24\": rpc error: code = NotFound desc = could not find container \"c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24\": container with ID starting with c2c4876a5323313b3215d7c5a57838207d085e44fb2611778f5ead77d2c0be24 not found: ID does not exist" Apr 24 23:00:00.572821 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.572801 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 23:00:00.575589 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:00.575554 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-4aba3-predictor-6dcd794499-dzrk9"] Apr 24 23:00:01.574312 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:01.574281 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" path="/var/lib/kubelet/pods/47c13b48-ba40-4eb0-b3c4-1e09c63114a9/volumes" Apr 24 23:00:04.554022 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:04.553995 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" Apr 24 23:00:04.554515 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:04.554489 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 23:00:14.554863 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:14.554823 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 23:00:24.554742 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:24.554702 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 23:00:34.555382 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:34.555342 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 23:00:37.085111 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.085075 2565 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"]
Apr 24 23:00:37.085625 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.085367 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" containerID="cri-o://67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a" gracePeriod=30
Apr 24 23:00:37.085625 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.085382 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kube-rbac-proxy" containerID="cri-o://7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3" gracePeriod=30
Apr 24 23:00:37.125307 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125271 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"]
Apr 24 23:00:37.125586 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125561 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container"
Apr 24 23:00:37.125641 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125592 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container"
Apr 24 23:00:37.125641 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125607 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kube-rbac-proxy"
Apr 24 23:00:37.125641 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125613 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kube-rbac-proxy"
Apr 24 23:00:37.125733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125665 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kserve-container"
Apr 24 23:00:37.125733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.125685 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="47c13b48-ba40-4eb0-b3c4-1e09c63114a9" containerName="kube-rbac-proxy"
Apr 24 23:00:37.128518 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.128503 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.130289 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.130261 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d0532-predictor-serving-cert\""
Apr 24 23:00:37.130429 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.130268 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d0532-kube-rbac-proxy-sar-config\""
Apr 24 23:00:37.139890 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.139859 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"]
Apr 24 23:00:37.287420 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.287379 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.287616 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.287431 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.287616 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.287549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tzw\" (UniqueName: \"kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.388267 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.388172 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tzw\" (UniqueName: \"kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.388267 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.388217 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.388496 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.388340 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.388973 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.388948 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.390612 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.390593 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.395331 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.395311 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tzw\" (UniqueName: \"kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw\") pod \"success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.439991 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.439945 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:37.557411 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.557386 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"]
Apr 24 23:00:37.559791 ip-10-0-142-202 kubenswrapper[2565]: W0424 23:00:37.559766 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549d96f8_2628_49c9_b80f_f3fecb0ca7a9.slice/crio-3878ed6888fe1e858c27e27ba73a50121c539e9a64532da10298a57a8d452db7 WatchSource:0}: Error finding container 3878ed6888fe1e858c27e27ba73a50121c539e9a64532da10298a57a8d452db7: Status 404 returned error can't find the container with id 3878ed6888fe1e858c27e27ba73a50121c539e9a64532da10298a57a8d452db7
Apr 24 23:00:37.657888 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.657861 2565 generic.go:358] "Generic (PLEG): container finished" podID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerID="7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3" exitCode=2
Apr 24 23:00:37.657992 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.657924 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerDied","Data":"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"}
Apr 24 23:00:37.659291 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.659265 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerStarted","Data":"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf"}
Apr 24 23:00:37.659401 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:37.659304 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerStarted","Data":"3878ed6888fe1e858c27e27ba73a50121c539e9a64532da10298a57a8d452db7"}
Apr 24 23:00:38.662950 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:38.662910 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerStarted","Data":"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868"}
Apr 24 23:00:38.663318 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:38.663073 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:38.680163 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:38.680107 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podStartSLOduration=1.680092157 podStartE2EDuration="1.680092157s" podCreationTimestamp="2026-04-24 23:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:00:38.678456907 +0000 UTC m=+1861.694141306" watchObservedRunningTime="2026-04-24 23:00:38.680092157 +0000 UTC m=+1861.695776555"
Apr 24 23:00:39.665598 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:39.665550 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:39.667050 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:39.667022 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 24 23:00:40.175762 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.175722 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 23:00:40.180062 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.180028 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 24 23:00:40.343864 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.343837 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"
Apr 24 23:00:40.509488 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.509442 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls\") pod \"7b0231fe-2a96-4d74-a5f8-9528010359db\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") "
Apr 24 23:00:40.509488 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.509499 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\") pod \"7b0231fe-2a96-4d74-a5f8-9528010359db\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") "
Apr 24 23:00:40.509762 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.509544 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdm78\" (UniqueName: \"kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78\") pod \"7b0231fe-2a96-4d74-a5f8-9528010359db\" (UID: \"7b0231fe-2a96-4d74-a5f8-9528010359db\") "
Apr 24 23:00:40.509888 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.509862 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-2e6fe-kube-rbac-proxy-sar-config") pod "7b0231fe-2a96-4d74-a5f8-9528010359db" (UID: "7b0231fe-2a96-4d74-a5f8-9528010359db"). InnerVolumeSpecName "success-200-isvc-2e6fe-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 23:00:40.511646 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.511624 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b0231fe-2a96-4d74-a5f8-9528010359db" (UID: "7b0231fe-2a96-4d74-a5f8-9528010359db"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 23:00:40.511770 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.511747 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78" (OuterVolumeSpecName: "kube-api-access-gdm78") pod "7b0231fe-2a96-4d74-a5f8-9528010359db" (UID: "7b0231fe-2a96-4d74-a5f8-9528010359db"). InnerVolumeSpecName "kube-api-access-gdm78". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 23:00:40.610524 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.610472 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b0231fe-2a96-4d74-a5f8-9528010359db-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 23:00:40.610524 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.610516 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7b0231fe-2a96-4d74-a5f8-9528010359db-success-200-isvc-2e6fe-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 23:00:40.610524 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.610527 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdm78\" (UniqueName: \"kubernetes.io/projected/7b0231fe-2a96-4d74-a5f8-9528010359db-kube-api-access-gdm78\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\""
Apr 24 23:00:40.669325 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669294 2565 generic.go:358] "Generic (PLEG): container finished" podID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerID="67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a" exitCode=0
Apr 24 23:00:40.669733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669369 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"
Apr 24 23:00:40.669733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669376 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerDied","Data":"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"}
Apr 24 23:00:40.669733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669410 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t" event={"ID":"7b0231fe-2a96-4d74-a5f8-9528010359db","Type":"ContainerDied","Data":"08e13e62829f653bda6db128b1e52eacf1826577ae0a916b7ff080cbaf04ad57"}
Apr 24 23:00:40.669733 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669425 2565 scope.go:117] "RemoveContainer" containerID="7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"
Apr 24 23:00:40.669925 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.669890 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 24 23:00:40.677275 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.677253 2565 scope.go:117] "RemoveContainer" containerID="67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"
Apr 24 23:00:40.684085 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.684069 2565 scope.go:117] "RemoveContainer" containerID="7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"
Apr 24 23:00:40.684327 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:00:40.684310 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3\": container with ID starting with 7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3 not found: ID does not exist" containerID="7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"
Apr 24 23:00:40.684388 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.684333 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3"} err="failed to get container status \"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3\": rpc error: code = NotFound desc = could not find container \"7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3\": container with ID starting with 7432f50e95c08f68fed522d4559728e8d48ec2f50aa2e3f820e331e560eca0e3 not found: ID does not exist"
Apr 24 23:00:40.684388 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.684350 2565 scope.go:117] "RemoveContainer" containerID="67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"
Apr 24 23:00:40.684727 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:00:40.684710 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a\": container with ID starting with 67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a not found: ID does not exist" containerID="67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"
Apr 24 23:00:40.684781 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.684733 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a"} err="failed to get container status \"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a\": rpc error: code = NotFound desc = could not find container \"67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a\": container with ID starting with 67aaf453d153b8d9cfa57c6bcd33f9c33677b6cef7edc31ebe04c7f3b2736b3a not found: ID does not exist"
Apr 24 23:00:40.688097 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.688076 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"]
Apr 24 23:00:40.691056 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:40.691037 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-2e6fe-predictor-66c4d45c7c-xmb5t"]
Apr 24 23:00:41.575718 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:41.575681 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" path="/var/lib/kubelet/pods/7b0231fe-2a96-4d74-a5f8-9528010359db/volumes"
Apr 24 23:00:44.555713 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:44.555686 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"
Apr 24 23:00:45.673943 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:45.673915 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"
Apr 24 23:00:45.674341 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:45.674317 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 24 23:00:55.675147 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:00:55.675056 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 24 23:01:05.674562 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:05.674516 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 24 23:01:07.568617 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.568553 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"]
Apr 24 23:01:07.569065 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569049 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kube-rbac-proxy"
Apr 24 23:01:07.569118 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569069 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kube-rbac-proxy"
Apr 24 23:01:07.569118 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569089 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container"
Apr 24 23:01:07.569118 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569098 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container"
Apr 24 23:01:07.569214 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569167 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kserve-container"
Apr 24 23:01:07.569214 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.569178 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b0231fe-2a96-4d74-a5f8-9528010359db" containerName="kube-rbac-proxy"
Apr 24 23:01:07.573352 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.573323 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.575123 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.575103 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-be7a5-predictor-serving-cert\""
Apr 24 23:01:07.575647 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.575623 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\""
Apr 24 23:01:07.578694 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.578671 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"]
Apr 24 23:01:07.586104 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.586084 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"]
Apr 24 23:01:07.586365 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.586344 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container" containerID="cri-o://2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4" gracePeriod=30
Apr 24 23:01:07.586522 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.586471 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kube-rbac-proxy" containerID="cri-o://501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189" gracePeriod=30
Apr 24 23:01:07.614130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.614094 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.614245 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.614151 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rpc5\" (UniqueName: \"kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.614288 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.614270 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.714946 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.714917 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.714946 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.714958 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rpc5\" (UniqueName: \"kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.715147 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.714992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.715147 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:01:07.715104 2565 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-serving-cert: secret "success-200-isvc-be7a5-predictor-serving-cert" not found
Apr 24 23:01:07.715232 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:01:07.715159 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls podName:d3891870-cb3c-45aa-a614-aa2168d2fd00 nodeName:}" failed. No retries permitted until 2026-04-24 23:01:08.215140844 +0000 UTC m=+1891.230825222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls") pod "success-200-isvc-be7a5-predictor-6654cff85-qtt26" (UID: "d3891870-cb3c-45aa-a614-aa2168d2fd00") : secret "success-200-isvc-be7a5-predictor-serving-cert" not found
Apr 24 23:01:07.715507 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.715485 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.724701 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.724672 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rpc5\" (UniqueName: \"kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:07.744275 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.744240 2565 generic.go:358] "Generic (PLEG): container finished" podID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerID="501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189" exitCode=2
Apr 24 23:01:07.744405 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:07.744312 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerDied","Data":"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"}
Apr 24 23:01:08.219461 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.219429 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:08.221811 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.221783 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") pod \"success-200-isvc-be7a5-predictor-6654cff85-qtt26\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:08.484335 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.484237 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:08.606843 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.606619 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"]
Apr 24 23:01:08.609344 ip-10-0-142-202 kubenswrapper[2565]: W0424 23:01:08.609309 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3891870_cb3c_45aa_a614_aa2168d2fd00.slice/crio-be8d1215f613ab17547ac06e7450edad73d216f8626489f9b8e9b0e87b8e1439 WatchSource:0}: Error finding container be8d1215f613ab17547ac06e7450edad73d216f8626489f9b8e9b0e87b8e1439: Status 404 returned error can't find the container with id be8d1215f613ab17547ac06e7450edad73d216f8626489f9b8e9b0e87b8e1439
Apr 24 23:01:08.749774 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.749677 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerStarted","Data":"461be447c08c21809170bb3b66c1acc8aa60b08eecb50eb7512b4114a3e892f4"}
Apr 24 23:01:08.749774 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.749727 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerStarted","Data":"b640a8f62a0181119072472f2cbf695b86d75f6bd460a706b0240641cc3cae85"}
Apr 24 23:01:08.749774 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.749742 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerStarted","Data":"be8d1215f613ab17547ac06e7450edad73d216f8626489f9b8e9b0e87b8e1439"}
Apr 24 23:01:08.749997 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.749811 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:08.766932 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:08.766884 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podStartSLOduration=1.766869588 podStartE2EDuration="1.766869588s" podCreationTimestamp="2026-04-24 23:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:01:08.765099705 +0000 UTC m=+1891.780784110" watchObservedRunningTime="2026-04-24 23:01:08.766869588 +0000 UTC m=+1891.782553987"
Apr 24 23:01:09.549762 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:09.549719 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused"
Apr 24 23:01:09.752128 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:09.752100 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"
Apr 24 23:01:09.753421 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:09.753392 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 24 23:01:10.727800 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.727776 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"
Apr 24 23:01:10.755505 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755475 2565 generic.go:358] "Generic (PLEG): container finished" podID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerID="2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4" exitCode=0
Apr 24 23:01:10.755963 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755553 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerDied","Data":"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4"}
Apr 24 23:01:10.755963 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755600 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp" event={"ID":"ab2f9ce5-6e26-405a-809c-d2ab579d7913","Type":"ContainerDied","Data":"b7e1bd9038ab89f6027df794aeb4fd8398efb278b0f800f51cfd9677a109d014"}
Apr 24 23:01:10.755963 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755607 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"
Apr 24 23:01:10.755963 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755622 2565 scope.go:117] "RemoveContainer" containerID="501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"
Apr 24 23:01:10.755963 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.755831 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 24 23:01:10.763326 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.763309 2565 scope.go:117] "RemoveContainer" containerID="2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4"
Apr 24 23:01:10.769920 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.769899 2565 scope.go:117] "RemoveContainer" containerID="501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"
Apr 24 23:01:10.770145 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:01:10.770125 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189\": container with ID starting with 501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189 not found: ID does not exist" containerID="501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"
Apr 24 23:01:10.770203 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.770154 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189"} err="failed to get container status \"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189\": rpc error: code = NotFound desc = could not find container
\"501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189\": container with ID starting with 501ce7f560dd4062c5a29c0929695c3be636ccb0e244cd618c94eb1f6b902189 not found: ID does not exist" Apr 24 23:01:10.770203 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.770173 2565 scope.go:117] "RemoveContainer" containerID="2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4" Apr 24 23:01:10.770403 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:01:10.770388 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4\": container with ID starting with 2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4 not found: ID does not exist" containerID="2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4" Apr 24 23:01:10.770448 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.770408 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4"} err="failed to get container status \"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4\": rpc error: code = NotFound desc = could not find container \"2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4\": container with ID starting with 2b0fd5c17376b353c0ed13c6e44e42cfb58d4abb94d7eab492f42fb5e22491c4 not found: ID does not exist" Apr 24 23:01:10.837989 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.837888 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config\") pod \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " Apr 24 23:01:10.837989 ip-10-0-142-202 kubenswrapper[2565]: I0424 
23:01:10.837944 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") pod \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " Apr 24 23:01:10.837989 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.837974 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5xc2\" (UniqueName: \"kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2\") pod \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\" (UID: \"ab2f9ce5-6e26-405a-809c-d2ab579d7913\") " Apr 24 23:01:10.838348 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.838301 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-44b1e-kube-rbac-proxy-sar-config") pod "ab2f9ce5-6e26-405a-809c-d2ab579d7913" (UID: "ab2f9ce5-6e26-405a-809c-d2ab579d7913"). InnerVolumeSpecName "success-200-isvc-44b1e-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:01:10.840128 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.840104 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ab2f9ce5-6e26-405a-809c-d2ab579d7913" (UID: "ab2f9ce5-6e26-405a-809c-d2ab579d7913"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:01:10.840552 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.840527 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2" (OuterVolumeSpecName: "kube-api-access-m5xc2") pod "ab2f9ce5-6e26-405a-809c-d2ab579d7913" (UID: "ab2f9ce5-6e26-405a-809c-d2ab579d7913"). InnerVolumeSpecName "kube-api-access-m5xc2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:01:10.938800 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.938759 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-44b1e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ab2f9ce5-6e26-405a-809c-d2ab579d7913-success-200-isvc-44b1e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:01:10.938800 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.938793 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ab2f9ce5-6e26-405a-809c-d2ab579d7913-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:01:10.938800 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:10.938806 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5xc2\" (UniqueName: \"kubernetes.io/projected/ab2f9ce5-6e26-405a-809c-d2ab579d7913-kube-api-access-m5xc2\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:01:11.077321 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:11.077289 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"] Apr 24 23:01:11.080724 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:11.080699 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-44b1e-predictor-589485df45-9jvsp"] Apr 24 23:01:11.575736 
ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:11.575703 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" path="/var/lib/kubelet/pods/ab2f9ce5-6e26-405a-809c-d2ab579d7913/volumes" Apr 24 23:01:15.674498 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:15.674461 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 23:01:15.760508 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:15.760482 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" Apr 24 23:01:15.761014 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:15.760985 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 23:01:25.675126 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:25.675096 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" Apr 24 23:01:25.766258 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:25.762284 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 23:01:35.761571 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:35.761526 2565 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 23:01:45.761880 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:45.761838 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 24 23:01:55.761717 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:01:55.761690 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" Apr 24 23:04:37.616329 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:04:37.616231 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:04:37.628163 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:04:37.628142 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:09:37.634306 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:09:37.634196 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:09:37.645728 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:09:37.645710 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:10:22.337306 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:22.337274 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"] Apr 24 23:10:22.337966 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:22.337602 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container" containerID="cri-o://b640a8f62a0181119072472f2cbf695b86d75f6bd460a706b0240641cc3cae85" gracePeriod=30 Apr 24 23:10:22.337966 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:22.337659 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kube-rbac-proxy" containerID="cri-o://461be447c08c21809170bb3b66c1acc8aa60b08eecb50eb7512b4114a3e892f4" gracePeriod=30 Apr 24 23:10:23.243308 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:23.243272 2565 generic.go:358] "Generic (PLEG): container finished" podID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerID="461be447c08c21809170bb3b66c1acc8aa60b08eecb50eb7512b4114a3e892f4" exitCode=2 Apr 24 23:10:23.243479 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:23.243322 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerDied","Data":"461be447c08c21809170bb3b66c1acc8aa60b08eecb50eb7512b4114a3e892f4"} Apr 24 23:10:25.250466 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.250429 2565 generic.go:358] "Generic (PLEG): container finished" podID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerID="b640a8f62a0181119072472f2cbf695b86d75f6bd460a706b0240641cc3cae85" exitCode=0 Apr 24 23:10:25.250837 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.250497 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerDied","Data":"b640a8f62a0181119072472f2cbf695b86d75f6bd460a706b0240641cc3cae85"} Apr 24 23:10:25.276310 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.276289 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" Apr 24 23:10:25.303220 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.303153 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rpc5\" (UniqueName: \"kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5\") pod \"d3891870-cb3c-45aa-a614-aa2168d2fd00\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " Apr 24 23:10:25.303220 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.303198 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config\") pod \"d3891870-cb3c-45aa-a614-aa2168d2fd00\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " Apr 24 23:10:25.303385 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.303244 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") pod \"d3891870-cb3c-45aa-a614-aa2168d2fd00\" (UID: \"d3891870-cb3c-45aa-a614-aa2168d2fd00\") " Apr 24 23:10:25.303703 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.303678 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-be7a5-kube-rbac-proxy-sar-config") 
pod "d3891870-cb3c-45aa-a614-aa2168d2fd00" (UID: "d3891870-cb3c-45aa-a614-aa2168d2fd00"). InnerVolumeSpecName "success-200-isvc-be7a5-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:10:25.305335 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.305300 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d3891870-cb3c-45aa-a614-aa2168d2fd00" (UID: "d3891870-cb3c-45aa-a614-aa2168d2fd00"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:10:25.305468 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.305447 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5" (OuterVolumeSpecName: "kube-api-access-5rpc5") pod "d3891870-cb3c-45aa-a614-aa2168d2fd00" (UID: "d3891870-cb3c-45aa-a614-aa2168d2fd00"). InnerVolumeSpecName "kube-api-access-5rpc5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:10:25.404327 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.404290 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3891870-cb3c-45aa-a614-aa2168d2fd00-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:10:25.404327 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.404322 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rpc5\" (UniqueName: \"kubernetes.io/projected/d3891870-cb3c-45aa-a614-aa2168d2fd00-kube-api-access-5rpc5\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:10:25.404527 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:25.404338 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-be7a5-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d3891870-cb3c-45aa-a614-aa2168d2fd00-success-200-isvc-be7a5-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:10:26.254597 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.254548 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" event={"ID":"d3891870-cb3c-45aa-a614-aa2168d2fd00","Type":"ContainerDied","Data":"be8d1215f613ab17547ac06e7450edad73d216f8626489f9b8e9b0e87b8e1439"} Apr 24 23:10:26.255045 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.254600 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26" Apr 24 23:10:26.255045 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.254615 2565 scope.go:117] "RemoveContainer" containerID="461be447c08c21809170bb3b66c1acc8aa60b08eecb50eb7512b4114a3e892f4" Apr 24 23:10:26.261939 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.261922 2565 scope.go:117] "RemoveContainer" containerID="b640a8f62a0181119072472f2cbf695b86d75f6bd460a706b0240641cc3cae85" Apr 24 23:10:26.268543 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.268521 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"] Apr 24 23:10:26.271780 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:26.271759 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-be7a5-predictor-6654cff85-qtt26"] Apr 24 23:10:27.579460 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:10:27.579426 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" path="/var/lib/kubelet/pods/d3891870-cb3c-45aa-a614-aa2168d2fd00/volumes" Apr 24 23:14:37.651262 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:14:37.651155 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:14:37.663134 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:14:37.663108 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log" Apr 24 23:17:56.796472 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:56.796390 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"] Apr 24 23:17:56.797079 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:56.796798 2565 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container" containerID="cri-o://df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf" gracePeriod=30 Apr 24 23:17:56.797079 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:56.796846 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kube-rbac-proxy" containerID="cri-o://4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868" gracePeriod=30 Apr 24 23:17:57.464958 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:57.464926 2565 generic.go:358] "Generic (PLEG): container finished" podID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerID="4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868" exitCode=2 Apr 24 23:17:57.465133 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:57.465001 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerDied","Data":"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868"} Apr 24 23:17:59.741988 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.741961 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" Apr 24 23:17:59.865314 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.865229 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config\") pod \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " Apr 24 23:17:59.865314 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.865274 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls\") pod \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " Apr 24 23:17:59.865314 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.865310 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tzw\" (UniqueName: \"kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw\") pod \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\" (UID: \"549d96f8-2628-49c9-b80f-f3fecb0ca7a9\") " Apr 24 23:17:59.865673 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.865643 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d0532-kube-rbac-proxy-sar-config") pod "549d96f8-2628-49c9-b80f-f3fecb0ca7a9" (UID: "549d96f8-2628-49c9-b80f-f3fecb0ca7a9"). InnerVolumeSpecName "success-200-isvc-d0532-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:17:59.867448 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.867421 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "549d96f8-2628-49c9-b80f-f3fecb0ca7a9" (UID: "549d96f8-2628-49c9-b80f-f3fecb0ca7a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:17:59.867543 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.867469 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw" (OuterVolumeSpecName: "kube-api-access-k4tzw") pod "549d96f8-2628-49c9-b80f-f3fecb0ca7a9" (UID: "549d96f8-2628-49c9-b80f-f3fecb0ca7a9"). InnerVolumeSpecName "kube-api-access-k4tzw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:17:59.966325 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.966289 2565 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d0532-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-success-200-isvc-d0532-kube-rbac-proxy-sar-config\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:17:59.966325 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.966319 2565 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-proxy-tls\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 23:17:59.966325 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:17:59.966334 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4tzw\" (UniqueName: \"kubernetes.io/projected/549d96f8-2628-49c9-b80f-f3fecb0ca7a9-kube-api-access-k4tzw\") on node \"ip-10-0-142-202.ec2.internal\" DevicePath \"\"" Apr 24 
23:18:00.473866 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.473834 2565 generic.go:358] "Generic (PLEG): container finished" podID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerID="df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf" exitCode=0 Apr 24 23:18:00.474065 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.473891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerDied","Data":"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf"} Apr 24 23:18:00.474065 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.473990 2565 scope.go:117] "RemoveContainer" containerID="4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868" Apr 24 23:18:00.474246 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.473918 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" event={"ID":"549d96f8-2628-49c9-b80f-f3fecb0ca7a9","Type":"ContainerDied","Data":"3878ed6888fe1e858c27e27ba73a50121c539e9a64532da10298a57a8d452db7"} Apr 24 23:18:00.474379 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.474364 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5" Apr 24 23:18:00.487370 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.487351 2565 scope.go:117] "RemoveContainer" containerID="df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf" Apr 24 23:18:00.497804 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.497772 2565 scope.go:117] "RemoveContainer" containerID="4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868" Apr 24 23:18:00.498107 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:18:00.498082 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868\": container with ID starting with 4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868 not found: ID does not exist" containerID="4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868" Apr 24 23:18:00.498207 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.498119 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868"} err="failed to get container status \"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868\": rpc error: code = NotFound desc = could not find container \"4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868\": container with ID starting with 4c19de2e614f468b53161f9d3d4d2b64e848844fadd1a49fc574cc0118c3f868 not found: ID does not exist" Apr 24 23:18:00.498207 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.498144 2565 scope.go:117] "RemoveContainer" containerID="df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf" Apr 24 23:18:00.498538 ip-10-0-142-202 kubenswrapper[2565]: E0424 23:18:00.498512 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf\": container with ID starting with df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf not found: ID does not exist" containerID="df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf"
Apr 24 23:18:00.498695 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.498544 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf"} err="failed to get container status \"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf\": rpc error: code = NotFound desc = could not find container \"df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf\": container with ID starting with df0e166161c4ba0b6a62a9e778e29bdb752cd4efde86ca6a2f599a227c9a3ecf not found: ID does not exist"
Apr 24 23:18:00.498695 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.498679 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"]
Apr 24 23:18:00.499623 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:00.499601 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d0532-predictor-6cfcdcc985-fjlp5"]
Apr 24 23:18:01.575556 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:01.575523 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" path="/var/lib/kubelet/pods/549d96f8-2628-49c9-b80f-f3fecb0ca7a9/volumes"
Apr 24 23:18:25.519824 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:25.519795 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-h2lq6_f1034778-383a-47c7-b317-b6284cb34a98/global-pull-secret-syncer/0.log"
Apr 24 23:18:25.597178 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:25.597134 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tk5jk_9179cf17-8c50-479c-8b4c-bb3b4144c5ec/konnectivity-agent/0.log"
Apr 24 23:18:25.695744 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:25.695713 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-202.ec2.internal_90b166908a0de8e6af429a39155ebb21/haproxy/0.log"
Apr 24 23:18:29.679567 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:29.679533 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x7xzg_f843b3fb-0ad4-46bb-b8ee-e06629cf2430/node-exporter/0.log"
Apr 24 23:18:29.701207 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:29.701176 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x7xzg_f843b3fb-0ad4-46bb-b8ee-e06629cf2430/kube-rbac-proxy/0.log"
Apr 24 23:18:29.722969 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:29.722936 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x7xzg_f843b3fb-0ad4-46bb-b8ee-e06629cf2430/init-textfile/0.log"
Apr 24 23:18:31.492830 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:31.492804 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-txzvf_2f6909d3-6ce8-4b0c-986f-82f40f5d2330/networking-console-plugin/0.log"
Apr 24 23:18:32.990741 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.990707 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"]
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.990968 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.990979 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.990987 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.990992 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991006 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991011 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991022 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991028 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991035 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991040 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991047 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991053 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991088 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991095 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991101 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kserve-container"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991109 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab2f9ce5-6e26-405a-809c-d2ab579d7913" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991115 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="549d96f8-2628-49c9-b80f-f3fecb0ca7a9" containerName="kube-rbac-proxy"
Apr 24 23:18:32.991130 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.991122 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3891870-cb3c-45aa-a614-aa2168d2fd00" containerName="kube-rbac-proxy"
Apr 24 23:18:32.993991 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.993969 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:32.995786 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.995759 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jpd6k\"/\"default-dockercfg-fs845\""
Apr 24 23:18:32.996190 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.996154 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"openshift-service-ca.crt\""
Apr 24 23:18:32.996310 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:32.996188 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jpd6k\"/\"kube-root-ca.crt\""
Apr 24 23:18:33.000948 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.000918 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-proc\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.001124 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.000966 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-podres\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.001124 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.001037 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-lib-modules\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.001250 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.001137 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfs55\" (UniqueName: \"kubernetes.io/projected/a654e38b-caa5-4bba-89f4-c8e174e22524-kube-api-access-vfs55\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.001250 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.001181 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-sys\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.003255 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.003230 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"]
Apr 24 23:18:33.101840 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-proc\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.101840 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-podres\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101871 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-lib-modules\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101912 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfs55\" (UniqueName: \"kubernetes.io/projected/a654e38b-caa5-4bba-89f4-c8e174e22524-kube-api-access-vfs55\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101949 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-sys\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.101914 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-proc\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.102014 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-podres\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.102025 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-lib-modules\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.102061 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.102038 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a654e38b-caa5-4bba-89f4-c8e174e22524-sys\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.109940 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.109922 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfs55\" (UniqueName: \"kubernetes.io/projected/a654e38b-caa5-4bba-89f4-c8e174e22524-kube-api-access-vfs55\") pod \"perf-node-gather-daemonset-75b62\" (UID: \"a654e38b-caa5-4bba-89f4-c8e174e22524\") " pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.304719 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.304635 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.425345 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.425316 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"]
Apr 24 23:18:33.427866 ip-10-0-142-202 kubenswrapper[2565]: W0424 23:18:33.427835 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda654e38b_caa5_4bba_89f4_c8e174e22524.slice/crio-7de43b583cda19df87b7bf30ad179b1c5e27cb2be86f7746da4eccfb8874f846 WatchSource:0}: Error finding container 7de43b583cda19df87b7bf30ad179b1c5e27cb2be86f7746da4eccfb8874f846: Status 404 returned error can't find the container with id 7de43b583cda19df87b7bf30ad179b1c5e27cb2be86f7746da4eccfb8874f846
Apr 24 23:18:33.429317 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.429300 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:18:33.556842 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.556744 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4vp4_a30d41a7-8c4f-4b0c-9cc0-a92a394596fe/dns/0.log"
Apr 24 23:18:33.563783 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.563751 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62" event={"ID":"a654e38b-caa5-4bba-89f4-c8e174e22524","Type":"ContainerStarted","Data":"e54b12163fca84dc8b83019be9d0ae08a7ba4ba24c062b4a7d8291162a5e22d0"}
Apr 24 23:18:33.563943 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.563801 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62" event={"ID":"a654e38b-caa5-4bba-89f4-c8e174e22524","Type":"ContainerStarted","Data":"7de43b583cda19df87b7bf30ad179b1c5e27cb2be86f7746da4eccfb8874f846"}
Apr 24 23:18:33.563943 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.563838 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:33.579224 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.579171 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62" podStartSLOduration=1.579153982 podStartE2EDuration="1.579153982s" podCreationTimestamp="2026-04-24 23:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:18:33.578012409 +0000 UTC m=+2936.593696810" watchObservedRunningTime="2026-04-24 23:18:33.579153982 +0000 UTC m=+2936.594838381"
Apr 24 23:18:33.580007 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.579984 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-t4vp4_a30d41a7-8c4f-4b0c-9cc0-a92a394596fe/kube-rbac-proxy/0.log"
Apr 24 23:18:33.641104 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:33.641073 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ztc7p_eed2cad7-a1f7-4e91-826b-b6ca9587c1c9/dns-node-resolver/0.log"
Apr 24 23:18:34.053970 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:34.053939 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c2mrs_2f1645d7-db01-4339-bbd8-b67eb1828971/node-ca/0.log"
Apr 24 23:18:35.123457 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:35.123420 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7pxrr_57eedb74-e256-4612-845f-7dc838139e1f/serve-healthcheck-canary/0.log"
Apr 24 23:18:35.592425 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:35.592392 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmbbr_c41cb4fe-11d2-4017-a2e0-a3ba6697dc85/kube-rbac-proxy/0.log"
Apr 24 23:18:35.611614 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:35.611563 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmbbr_c41cb4fe-11d2-4017-a2e0-a3ba6697dc85/exporter/0.log"
Apr 24 23:18:35.631101 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:35.631068 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mmbbr_c41cb4fe-11d2-4017-a2e0-a3ba6697dc85/extractor/0.log"
Apr 24 23:18:37.737167 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:37.737133 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-6fsph_a096ba47-2ec2-494d-830b-c280b7ccced6/manager/0.log"
Apr 24 23:18:37.994698 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:37.994660 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-82wxw_ba03a81b-c63a-4992-80ed-f1a2fcadc8f6/manager/0.log"
Apr 24 23:18:39.578361 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:39.578334 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jpd6k/perf-node-gather-daemonset-75b62"
Apr 24 23:18:43.433755 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.433724 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/kube-multus-additional-cni-plugins/0.log"
Apr 24 23:18:43.455723 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.455695 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/egress-router-binary-copy/0.log"
Apr 24 23:18:43.536904 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.536866 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/cni-plugins/0.log"
Apr 24 23:18:43.605012 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.604958 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/bond-cni-plugin/0.log"
Apr 24 23:18:43.670266 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.670244 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/routeoverride-cni/0.log"
Apr 24 23:18:43.690887 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.690808 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/whereabouts-cni-bincopy/0.log"
Apr 24 23:18:43.710869 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.710843 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-swjrh_23b22676-e2ab-4cd8-97f3-c119a27160e7/whereabouts-cni/0.log"
Apr 24 23:18:43.739925 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.739856 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pmbqp_22314c26-39e2-493f-a8fc-95107d7fe18b/kube-multus/0.log"
Apr 24 23:18:43.881524 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.881496 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9bjhd_ffaace54-8d28-433c-b3bf-e5664064b07e/network-metrics-daemon/0.log"
Apr 24 23:18:43.901297 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:43.901273 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9bjhd_ffaace54-8d28-433c-b3bf-e5664064b07e/kube-rbac-proxy/0.log"
Apr 24 23:18:45.237097 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.237067 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-controller/0.log"
Apr 24 23:18:45.261423 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.261390 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/0.log"
Apr 24 23:18:45.274729 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.274694 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovn-acl-logging/1.log"
Apr 24 23:18:45.293367 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.293340 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/kube-rbac-proxy-node/0.log"
Apr 24 23:18:45.315566 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.315519 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 23:18:45.335709 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.335667 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/northd/0.log"
Apr 24 23:18:45.359124 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.359093 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/nbdb/0.log"
Apr 24 23:18:45.378359 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.378309 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/sbdb/0.log"
Apr 24 23:18:45.477754 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:45.477695 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sm54g_a1918ea1-23d1-4627-af99-2e000c93ecfd/ovnkube-controller/0.log"
Apr 24 23:18:46.469985 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:46.469956 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lgf7j_01f08b4f-503c-494e-836c-e58cbfde457a/network-check-target-container/0.log"
Apr 24 23:18:47.408851 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:47.408821 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gkgwr_6ec7071c-10cb-4085-847c-cb8ce4b31cb9/iptables-alerter/0.log"
Apr 24 23:18:48.099817 ip-10-0-142-202 kubenswrapper[2565]: I0424 23:18:48.099784 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lsvxp_813a2f60-88c4-4200-a499-4d307772c734/tuned/0.log"