Apr 22 18:44:04.898387 ip-10-0-138-84 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:44:04.898400 ip-10-0-138-84 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:44:04.898406 ip-10-0-138-84 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:44:04.898635 ip-10-0-138-84 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:14.947632 ip-10-0-138-84 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:14.947643 ip-10-0-138-84 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 25e87cba434e46129ab344a2ba394f11 --
Apr 22 18:46:21.536338 ip-10-0-138-84 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:21.987244 ip-10-0-138-84 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:21.987244 ip-10-0-138-84 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:21.987244 ip-10-0-138-84 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:21.987244 ip-10-0-138-84 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:21.987244 ip-10-0-138-84 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:21.990544 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.990449 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:21.994673 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994648 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:21.994673 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994670 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:21.994673 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994677 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:21.994673 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994682 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994686 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994691 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994695 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994699 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994703 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994706 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994710 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994713 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994717 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994721 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994725 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994729 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994733 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994736 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994740 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994744 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994748 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994752 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994755 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994759 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:21.995062 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994763 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994767 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994771 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994775 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994778 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994788 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994792 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994796 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994800 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994804 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994808 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994812 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994817 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994822 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994826 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994830 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994834 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994838 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994842 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:21.995916 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994846 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994850 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994854 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994859 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994863 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994867 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994871 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994875 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994879 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994883 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994887 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994891 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994912 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994916 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994920 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994924 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994928 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994932 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994936 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994940 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:21.996567 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994944 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994948 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994952 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994956 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994961 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994968 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994975 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994982 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994987 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994992 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.994996 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995001 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995008 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995013 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995017 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995023 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995028 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995032 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995037 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:21.997079 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995043 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995047 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995052 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995056 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995715 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995723 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995728 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995732 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995736 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995741 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995745 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995749 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995754 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995758 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995762 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995768 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995772 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995776 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995780 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995786 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:21.997857 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995790 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995794 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995798 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995802 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995806 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995810 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995814 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995819 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995824 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995828 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995833 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995837 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995841 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995846 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995850 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995854 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995859 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995862 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995867 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995871 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:21.998360 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995875 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995879 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995884 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995890 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995914 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995919 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995923 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995928 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995931 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995935 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995940 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995946 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995951 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995955 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995959 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995963 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995974 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995980 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995984 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995989 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:21.999073 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.995993 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996000 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996004 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996009 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996013 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996017 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996021 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996025 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996029 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996033 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996038 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996042 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996046 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996050 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996054 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996059 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996063 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996067 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996071 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996075 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:21.999702 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996080 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996084 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996088 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996093 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996097 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996106 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996112 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996117 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996122 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.996126 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997068 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997084 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997094 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997101 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997110 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997116 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997123 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997130 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997135 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997140 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:22.000285 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997146 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997151 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997157 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997162 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997167 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997171 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997176 2572 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997180 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997185 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997193 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997198 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997204 2572 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997208 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997214 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997221 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997226 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997231 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997236 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997241 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997246 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997251 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997256 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997261 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997267 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997274 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:22.000890 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997279 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997284 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997290 2572 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:22.001696
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997295 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997301 2572 flags.go:64] FLAG: --event-burst="100" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997306 2572 flags.go:64] FLAG: --event-qps="50" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997311 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997316 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997321 2572 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997327 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997332 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997337 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997342 2572 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997347 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997352 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997357 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997367 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997372 2572 flags.go:64] FLAG: 
--fail-cgroupv1="false" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997376 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997381 2572 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997387 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997392 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997397 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997403 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997408 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:22.001696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997413 2572 flags.go:64] FLAG: --help="false" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997418 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997423 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997428 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997433 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997438 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997445 2572 
flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997449 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997454 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997459 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997464 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997469 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997474 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997479 2572 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997484 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997488 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997493 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997498 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997502 2572 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997507 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997512 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: 
I0422 18:46:21.997517 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997526 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:22.002355 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997532 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997537 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997542 2572 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997546 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997552 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997557 2572 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997562 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997569 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997574 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997581 2572 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997586 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997590 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997595 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 
18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997600 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997605 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997610 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997615 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997626 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997631 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997636 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997642 2572 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997646 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997654 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997659 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:22.003035 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997664 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997669 2572 flags.go:64] FLAG: --port="10250" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997673 2572 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997678 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0539d5db5dfb0fd2d" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997683 2572 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997691 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997696 2572 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997701 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997706 2572 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997723 2572 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997728 2572 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997733 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997737 2572 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997743 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997748 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997753 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997758 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.997763 2572 flags.go:64] FLAG: 
--runonce="false" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998624 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998636 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998642 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998648 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998653 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998658 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998663 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998668 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:22.003657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998673 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998678 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998683 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998689 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998694 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998699 2572 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:22.004319 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:46:21.998704 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998714 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998718 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998723 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998731 2572 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998739 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998743 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998748 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998753 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998758 2572 flags.go:64] FLAG: --v="2" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998765 2572 flags.go:64] FLAG: --version="false" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998772 2572 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998778 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:21.998784 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998947 2572 feature_gate.go:328] unrecognized feature gate: Example 
Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998956 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998962 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998966 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:22.004319 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998971 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998975 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998979 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998984 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998987 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998992 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.998996 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999000 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999004 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:22.004913 ip-10-0-138-84 
kubenswrapper[2572]: W0422 18:46:21.999009 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999013 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999017 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999023 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999027 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999031 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999035 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999039 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999043 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999048 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:22.004913 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999054 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999058 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999062 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 
18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999067 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999071 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999075 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999079 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999083 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999087 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999092 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999096 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999100 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999105 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999110 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999114 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999119 2572 feature_gate.go:328] unrecognized feature gate: 
NewOLMPreflightPermissionChecks Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999123 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999127 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999131 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999135 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:22.005422 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999139 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999143 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999147 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999152 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999156 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999161 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999166 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999170 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999174 
2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999178 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999182 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999186 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999191 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999194 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999199 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999202 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999207 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999211 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999215 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999219 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:22.005975 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999223 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:22.006467 ip-10-0-138-84 
kubenswrapper[2572]: W0422 18:46:21.999227 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999231 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999236 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999240 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999245 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999249 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999253 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999260 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999265 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999269 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999273 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999277 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999282 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999286 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999290 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999294 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999299 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999303 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999308 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:22.006467 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999312 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:22.006969 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999316 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:22.006969 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:21.999323 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:22.006969 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.000121 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:22.006969 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.006765 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:46:22.006969 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.006787 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:46:22.011162 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.008182 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:46:22.011544 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.010189 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:46:22.011544 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.011158 2572 server.go:1019] "Starting client certificate rotation"
Apr 22 18:46:22.011544 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.011250 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:22.011544 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.011284 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:22.036098 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.036075 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:22.038768 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.038748 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:22.057302 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.057277 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:46:22.062598 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.062572 2572 log.go:25] "Validated CRI v1 image API"
Apr 22 18:46:22.064493 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.064477 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:22.069058 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.069034 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a01d3acd-1a49-4cdb-9707-7d0b69dd28c9:/dev/nvme0n1p3 f10bc7bd-a365-437d-a9c1-340d9357eea2:/dev/nvme0n1p4]
Apr 22 18:46:22.069147 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.069059 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:22.072054 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.072037 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:22.075679 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.075560 2572 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:22.072990361 +0000 UTC m=+0.410976202 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098536 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a98b9d823c84926d70c54526e9514 SystemUUID:ec2a98b9-d823-c849-26d7-0c54526e9514 BootID:25e87cba-434e-4612-9ab3-44a2ba394f11 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:10:c3:64:20:df Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:10:c3:64:20:df Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:99:0a:61:48:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:22.075679 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.075674 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:22.075791 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.075762 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:22.077476 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.077451 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:22.077621 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.077479 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-84.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","
Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:22.077663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.077630 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:22.077663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.077638 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:22.077663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.077657 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:22.078379 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.078369 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:22.079642 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.079631 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:22.079989 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.079979 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:22.082445 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.082434 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:22.082479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.082450 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:46:22.082479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.082463 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:22.082479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.082472 2572 
kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:22.082567 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.082484 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:22.083661 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.083648 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:22.083698 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.083672 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:22.086930 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.086914 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:22.088534 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.088519 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:22.089665 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089652 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089672 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089681 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089690 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089699 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089707 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089716 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089725 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089736 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089744 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:22.089754 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089757 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:22.090101 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.089770 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:22.090742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.090730 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:22.090791 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.090745 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:22.094681 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.094657 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:22.094782 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.094726 2572 server.go:1295] "Started kubelet" Apr 22 18:46:22.095743 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.095282 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:22.095743 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.095718 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:22.095879 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.095774 2572 
server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:22.095811 ip-10-0-138-84 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:46:22.097093 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.097074 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-84.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:22.097145 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.097096 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-84.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:22.097178 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.097165 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:22.098487 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.098460 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:22.099152 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.099133 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:22.104828 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.104810 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:22.104955 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.103575 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-84.ec2.internal.18a8c2341dfa1cde default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-84.ec2.internal,UID:ip-10-0-138-84.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-84.ec2.internal,},FirstTimestamp:2026-04-22 18:46:22.094679262 +0000 UTC m=+0.432665101,LastTimestamp:2026-04-22 18:46:22.094679262 +0000 UTC m=+0.432665101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-84.ec2.internal,}" Apr 22 18:46:22.104955 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.104839 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:22.105591 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105574 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:22.105705 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105578 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:22.105705 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105708 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:22.105848 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.105682 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.105848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105753 2572 factory.go:55] Registering systemd factory Apr 22 18:46:22.105848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105772 2572 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:22.105848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105787 2572 reconstruct.go:97] 
"Volume reconstruction finished" Apr 22 18:46:22.105848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.105796 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:22.106093 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106056 2572 factory.go:153] Registering CRI-O factory Apr 22 18:46:22.106093 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106084 2572 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:22.106154 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106137 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:22.106206 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106180 2572 factory.go:103] Registering Raw factory Apr 22 18:46:22.106206 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106200 2572 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:22.106647 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.106632 2572 manager.go:319] Starting recovery of all containers Apr 22 18:46:22.108073 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.108047 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:22.110385 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.110360 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k78qb" Apr 22 18:46:22.113439 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.113305 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-84.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:46:22.113563 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.113540 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:46:22.115772 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.115753 2572 manager.go:324] Recovery completed Apr 22 18:46:22.117511 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.117492 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:46:22.119205 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.119190 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-k78qb" Apr 22 18:46:22.120506 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.120493 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.123216 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:46:22.123201 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.123267 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.123239 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.123267 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.123251 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.123848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.123832 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:22.123848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.123847 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:22.124012 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.123862 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:22.125938 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.125821 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-84.ec2.internal.18a8c2341fad92b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-84.ec2.internal,UID:ip-10-0-138-84.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-84.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-84.ec2.internal,},FirstTimestamp:2026-04-22 18:46:22.123217592 +0000 UTC m=+0.461203441,LastTimestamp:2026-04-22 18:46:22.123217592 +0000 UTC m=+0.461203441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-84.ec2.internal,}" Apr 22 18:46:22.128162 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.128146 2572 policy_none.go:49] "None policy: Start" Apr 22 18:46:22.128236 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.128167 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:22.128236 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.128180 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.167500 2572 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.167534 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.167547 2572 server.go:85] "Starting device plugin registration server" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.167831 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.167844 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.167944 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.168041 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.168051 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.168630 2572 eviction_manager.go:267] "eviction manager: failed to check if 
we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:22.186604 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.168671 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.198779 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.198742 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:22.200152 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.200133 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:22.200266 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.200159 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:22.200266 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.200180 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:22.200266 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.200185 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:22.200266 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.200231 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:22.203399 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.203377 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:22.268794 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.268722 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.269844 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.269828 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.269938 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.269862 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.269938 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.269872 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.269938 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.269911 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.278680 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.278661 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.278680 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.278682 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-84.ec2.internal\": node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.294069 
ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.294048 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.301024 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.300990 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal"] Apr 22 18:46:22.301102 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.301060 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.302171 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.302157 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.302243 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.302182 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.302243 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.302192 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.304429 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.304418 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.304583 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.304570 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.304618 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.304600 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.305143 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305128 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.305143 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305137 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.305238 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305157 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.305238 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305160 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.305238 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305169 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.305238 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.305170 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.306690 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.306671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: \"9ccd50016e9bd4e3cb2e95f7158c6eed\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.306769 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.306703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: \"9ccd50016e9bd4e3cb2e95f7158c6eed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.306769 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.306729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/44e2894ddf571d488225d543b36d7bb8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-84.ec2.internal\" (UID: \"44e2894ddf571d488225d543b36d7bb8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.307310 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.307298 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.307358 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.307321 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:22.308065 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.308044 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:22.308065 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.308068 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:22.308164 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.308079 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:22.323436 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.323419 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-84.ec2.internal\" not found" node="ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.327761 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.327745 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-84.ec2.internal\" not found" node="ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.394883 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.394856 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.407154 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: 
\"9ccd50016e9bd4e3cb2e95f7158c6eed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.407228 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407159 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: \"9ccd50016e9bd4e3cb2e95f7158c6eed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.407228 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/44e2894ddf571d488225d543b36d7bb8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-84.ec2.internal\" (UID: \"44e2894ddf571d488225d543b36d7bb8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.407228 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407207 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/44e2894ddf571d488225d543b36d7bb8-config\") pod \"kube-apiserver-proxy-ip-10-0-138-84.ec2.internal\" (UID: \"44e2894ddf571d488225d543b36d7bb8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.407342 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: \"9ccd50016e9bd4e3cb2e95f7158c6eed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.407342 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.407244 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ccd50016e9bd4e3cb2e95f7158c6eed-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal\" (UID: \"9ccd50016e9bd4e3cb2e95f7158c6eed\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.495699 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.495655 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.596345 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.596309 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.624775 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.624754 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.630231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:22.630206 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:22.696852 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.696801 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.797296 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.797267 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.897867 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.897795 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:22.998401 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:22.998374 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.010976 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.010949 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:23.011086 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.011034 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:23.011086 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.011075 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:23.098818 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:23.098787 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.104955 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.104936 2572 
certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:23.115450 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.115423 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:23.120833 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.120803 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:22 +0000 UTC" deadline="2027-11-27 13:41:32.98764451 +0000 UTC" Apr 22 18:46:23.120924 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.120833 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14010h55m9.866814783s" Apr 22 18:46:23.132572 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.132552 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vl8sr" Apr 22 18:46:23.143533 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.143515 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vl8sr" Apr 22 18:46:23.167716 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:23.167485 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44e2894ddf571d488225d543b36d7bb8.slice/crio-87ac63fba5bb3e956bf7c0b9e2a3956666eb4657598f7466c9d616e449e72738 WatchSource:0}: Error finding container 87ac63fba5bb3e956bf7c0b9e2a3956666eb4657598f7466c9d616e449e72738: Status 404 returned error can't find the container with id 87ac63fba5bb3e956bf7c0b9e2a3956666eb4657598f7466c9d616e449e72738 Apr 22 18:46:23.173483 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.173469 2572 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 22 18:46:23.199706 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:23.199675 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.203584 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.203537 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" event={"ID":"9ccd50016e9bd4e3cb2e95f7158c6eed","Type":"ContainerStarted","Data":"d9e5abb24c8466031d104273253a948960bb127a09fa962197eddbfdab1172da"} Apr 22 18:46:23.204459 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.204434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" event={"ID":"44e2894ddf571d488225d543b36d7bb8","Type":"ContainerStarted","Data":"87ac63fba5bb3e956bf7c0b9e2a3956666eb4657598f7466c9d616e449e72738"} Apr 22 18:46:23.248817 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.248798 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:23.300335 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:23.300306 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.400910 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:23.400871 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.501372 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:23.501310 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-84.ec2.internal\" not found" Apr 22 18:46:23.595023 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.595001 2572 reflector.go:430] "Caches populated" type="*v1.Node" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:23.605566 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.605549 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" Apr 22 18:46:23.617565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.617537 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:23.618622 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.618605 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" Apr 22 18:46:23.624493 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:23.624478 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:24.084398 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.084367 2572 apiserver.go:52] "Watching apiserver" Apr 22 18:46:24.089623 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.089601 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:24.090512 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.090487 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-q65sk","openshift-image-registry/node-ca-x6px5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal","openshift-multus/network-metrics-daemon-8xjpc","openshift-network-operator/iptables-alerter-lvpr2","openshift-ovn-kubernetes/ovnkube-node-vtpk8","kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd","openshift-multus/multus-additional-cni-plugins-fsns5","openshift-multus/multus-nnfk2","openshift-network-diagnostics/network-check-target-h9t7j","kube-system/konnectivity-agent-2gt98","openshift-cluster-node-tuning-operator/tuned-9q6ht"] Apr 22 18:46:24.093250 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.093211 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.095337 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.095313 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.095432 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.095413 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.095521 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.095503 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qqhr8\"" Apr 22 18:46:24.095521 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.095517 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:24.095631 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.095611 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:24.097586 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.097557 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:24.097688 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.097631 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:24.100093 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.100073 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.100213 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.100197 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.102438 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102416 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.102530 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102514 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.102596 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:24.102647 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102636 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-cgt5g\"" Apr 22 18:46:24.103032 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102782 2572 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jr8wm\"" Apr 22 18:46:24.103032 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102830 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:24.103032 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.102851 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.103223 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.103143 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.105214 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.105190 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.105893 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.105874 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.107946 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.107814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:24.107946 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.107828 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.108151 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.108135 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql9v8\"" Apr 22 18:46:24.108453 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.108436 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xtggb\"" Apr 22 18:46:24.108840 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.108823 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.110265 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.109638 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.110265 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.109643 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.110400 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.110367 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:24.110455 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.110402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:24.110508 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.110448 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.110606 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.110559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:24.111087 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.110908 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:24.112209 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.112186 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:24.112503 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.112487 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:24.112913 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.112883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kxtjg\"" Apr 22 18:46:24.113746 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.113732 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-q65sk" Apr 22 18:46:24.115622 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.115605 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8kt4h\"" Apr 22 18:46:24.115716 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.115639 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.115816 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.115622 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.116024 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116006 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:24.116119 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.116100 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:24.116410 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-multus\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.116504 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116444 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:24.116504 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.116504 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116499 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.116656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116525 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-socket-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.116656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-sys-fs\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.116656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-k8s-cni-cncf-io\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.116656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-hostroot\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.116844 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116678 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-conf-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.116844 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.116844 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.116844 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116769 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.116844 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-netns\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116882 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-multus-certs\") pod 
\"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-bin\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.116983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znm5v\" (UniqueName: \"kubernetes.io/projected/53d298d6-4725-419c-b9f4-0f58a63b1715-kube-api-access-znm5v\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-bin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-multus-daemon-config\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117094 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm576\" (UniqueName: 
\"kubernetes.io/projected/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-kube-api-access-pm576\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117099 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117125 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-socket-dir-parent\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117151 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-netd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-config\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:46:24.117194 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53d298d6-4725-419c-b9f4-0f58a63b1715-serviceca\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cnibin\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpngr\" (UniqueName: \"kubernetes.io/projected/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kube-api-access-kpngr\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117290 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9x6h\" (UniqueName: \"kubernetes.io/projected/e28dd910-549e-488c-8e99-3ad3f1d11a5e-kube-api-access-w9x6h\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:24.117368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-host-slash\") pod \"iptables-alerter-lvpr2\" (UID: 
\"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-device-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-os-release\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117434 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-etc-kubernetes\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftvp\" (UniqueName: \"kubernetes.io/projected/26feee8a-9b45-4708-a356-fcabada1a28c-kube-api-access-gftvp\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-iptables-alerter-script\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-systemd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovn-node-metrics-cert\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-system-cni-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117583 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.117786 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-etc-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-script-lib\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74nf\" (UniqueName: \"kubernetes.io/projected/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-kube-api-access-r74nf\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117675 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-registration-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117719 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-cni-binary-copy\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.117786 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-kubelet\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-slash\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-var-lib-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117840 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-ovn\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svm8\" (UniqueName: \"kubernetes.io/projected/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-kube-api-access-9svm8\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117918 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-os-release\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-log-socket\") pod \"ovnkube-node-vtpk8\" (UID: 
\"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.117985 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-env-overrides\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-kubelet\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-systemd-units\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118084 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-netns\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-system-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118145 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-node-log\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-cnibin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.118424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.118241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53d298d6-4725-419c-b9f4-0f58a63b1715-host\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.120749 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.120731 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.121074 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.121058 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2gt98" Apr 22 18:46:24.122764 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.122742 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:24.123140 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.123120 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:24.123378 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.123359 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rvxv5\"" Apr 22 18:46:24.123460 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.123429 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:24.123517 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.123360 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v9xh8\"" Apr 22 18:46:24.123630 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.123616 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:24.144204 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.144174 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:23 +0000 UTC" deadline="2027-09-20 21:24:20.845083592 +0000 UTC" Apr 22 18:46:24.144204 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.144197 2572 certificate_manager.go:431] "Waiting for 
next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12386h37m56.700888902s" Apr 22 18:46:24.206939 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.206916 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:24.219233 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-systemd-units\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219353 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-netns\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219353 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219274 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9cc1681f-a720-40b5-a0fa-1d414f3f4906-konnectivity-ca\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98" Apr 22 18:46:24.219353 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-systemd-units\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219349 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-lib-modules\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219386 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g882j\" (UniqueName: \"kubernetes.io/projected/07979851-c8ab-4500-998e-e7498964b0a7-kube-api-access-g882j\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-system-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-conf\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-var-lib-kubelet\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 
18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-netns\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219518 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-node-log\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219525 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-system-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-node-log\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.219849 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:46:24.219577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-run\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-cnibin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53d298d6-4725-419c-b9f4-0f58a63b1715-host\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-etc-selinux\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:24.219849 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-sys\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219701 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-multus\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53d298d6-4725-419c-b9f4-0f58a63b1715-host\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219710 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-cnibin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219751 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-multus\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219788 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-run-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.219849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-socket-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-sys-fs\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-hosts-file\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.219876 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-k8s-cni-cncf-io\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:46:24.219963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-k8s-cni-cncf-io\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.219971 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.719939537 +0000 UTC m=+3.057925360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-sys-fs\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-socket-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.219990 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-hostroot\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-conf-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-conf-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220072 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220045 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-hostroot\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.220696 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220087 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220102 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwpgh\" (UniqueName: \"kubernetes.io/projected/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-kube-api-access-lwpgh\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-modprobe-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-host\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220191 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-netns\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-multus-certs\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220263 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-bin\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znm5v\" (UniqueName: \"kubernetes.io/projected/53d298d6-4725-419c-b9f4-0f58a63b1715-kube-api-access-znm5v\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220316 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220330 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-multus-certs\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-systemd\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220339 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-cni-dir\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-run-netns\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220387 2572 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-bin\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220453 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-etc-tuned\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-bin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.221479 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220538 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-multus-daemon-config\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm576\" (UniqueName: \"kubernetes.io/projected/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-kube-api-access-pm576\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220579 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220609 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-cni-bin\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220676 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9cc1681f-a720-40b5-a0fa-1d414f3f4906-agent-certs\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-socket-dir-parent\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222195 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-netd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220774 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-config\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53d298d6-4725-419c-b9f4-0f58a63b1715-serviceca\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-cni-netd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222195 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220827 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-tmp\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-multus-socket-dir-parent\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cnibin\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpngr\" (UniqueName: \"kubernetes.io/projected/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kube-api-access-kpngr\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cnibin\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " 
pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222195 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.220971 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9x6h\" (UniqueName: \"kubernetes.io/projected/e28dd910-549e-488c-8e99-3ad3f1d11a5e-kube-api-access-w9x6h\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-host-slash\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221128 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-host-slash\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-device-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-device-dir\") pod 
\"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-multus-daemon-config\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-os-release\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53d298d6-4725-419c-b9f4-0f58a63b1715-serviceca\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-etc-kubernetes\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gftvp\" (UniqueName: \"kubernetes.io/projected/26feee8a-9b45-4708-a356-fcabada1a28c-kube-api-access-gftvp\") pod \"multus-nnfk2\" (UID: 
\"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-iptables-alerter-script\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-etc-kubernetes\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-systemd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221346 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-config\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovn-node-metrics-cert\") pod \"ovnkube-node-vtpk8\" (UID: 
\"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221357 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-systemd\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221386 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-system-cni-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.222947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-os-release\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-etc-openvswitch\") pod 
\"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-system-cni-dir\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-script-lib\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-etc-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221514 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r74nf\" (UniqueName: \"kubernetes.io/projected/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-kube-api-access-r74nf\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-registration-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-tmp-dir\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-cni-binary-copy\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-kubelet\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-slash\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-var-lib-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-ovn\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9svm8\" (UniqueName: \"kubernetes.io/projected/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-kube-api-access-9svm8\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221670 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-os-release\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.223551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221740 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-iptables-alerter-script\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-log-socket\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-env-overrides\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-var-lib-openvswitch\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysconfig\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-kubernetes\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-kubelet\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221940 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovnkube-script-lib\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222070 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-kubelet\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-run-ovn\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-log-socket\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26feee8a-9b45-4708-a356-fcabada1a28c-host-var-lib-kubelet\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-host-slash\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222161 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-os-release\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.221695 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-registration-dir\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" Apr 22 18:46:24.224220 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222359 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-env-overrides\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:46:24.224877 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-binary-copy\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5"
Apr 22 18:46:24.224877 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5"
Apr 22 18:46:24.224877 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.222704 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/26feee8a-9b45-4708-a356-fcabada1a28c-cni-binary-copy\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2"
Apr 22 18:46:24.225229 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.225209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-ovn-node-metrics-cert\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:24.228266 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.228241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znm5v\" (UniqueName: \"kubernetes.io/projected/53d298d6-4725-419c-b9f4-0f58a63b1715-kube-api-access-znm5v\") pod \"node-ca-x6px5\" (UID: \"53d298d6-4725-419c-b9f4-0f58a63b1715\") " pod="openshift-image-registry/node-ca-x6px5"
Apr 22 18:46:24.228872 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.228785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm576\" (UniqueName: \"kubernetes.io/projected/28f9f4d1-15dc-40be-a8db-e7a35cb819c1-kube-api-access-pm576\") pod \"iptables-alerter-lvpr2\" (UID: \"28f9f4d1-15dc-40be-a8db-e7a35cb819c1\") " pod="openshift-network-operator/iptables-alerter-lvpr2"
Apr 22 18:46:24.229235 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.229210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpngr\" (UniqueName: \"kubernetes.io/projected/fc61b10e-6ece-4657-9ee5-9f863cd9a3d9-kube-api-access-kpngr\") pod \"aws-ebs-csi-driver-node-8xvnd\" (UID: \"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd"
Apr 22 18:46:24.229629 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.229607 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftvp\" (UniqueName: \"kubernetes.io/projected/26feee8a-9b45-4708-a356-fcabada1a28c-kube-api-access-gftvp\") pod \"multus-nnfk2\" (UID: \"26feee8a-9b45-4708-a356-fcabada1a28c\") " pod="openshift-multus/multus-nnfk2"
Apr 22 18:46:24.230244 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.230215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74nf\" (UniqueName: \"kubernetes.io/projected/bcd75ef0-f7af-4a32-b19b-aa29b44cd391-kube-api-access-r74nf\") pod \"multus-additional-cni-plugins-fsns5\" (UID: \"bcd75ef0-f7af-4a32-b19b-aa29b44cd391\") " pod="openshift-multus/multus-additional-cni-plugins-fsns5"
Apr 22 18:46:24.230820 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.230791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svm8\" (UniqueName: \"kubernetes.io/projected/1c76f00a-74ae-463c-9b29-4b39f9d6a26d-kube-api-access-9svm8\") pod \"ovnkube-node-vtpk8\" (UID: \"1c76f00a-74ae-463c-9b29-4b39f9d6a26d\") " pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:24.231246 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.231215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9x6h\" (UniqueName: \"kubernetes.io/projected/e28dd910-549e-488c-8e99-3ad3f1d11a5e-kube-api-access-w9x6h\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:24.322807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-run\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.322807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-sys\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-hosts-file\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-run\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwpgh\" (UniqueName: \"kubernetes.io/projected/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-kube-api-access-lwpgh\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-modprobe-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-host\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.322964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-sys\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323033 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-hosts-file\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-systemd\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-modprobe-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-etc-tuned\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323094 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-host\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-systemd\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323104 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9cc1681f-a720-40b5-a0fa-1d414f3f4906-agent-certs\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-tmp\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323169 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-d\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-tmp-dir\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysconfig\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-kubernetes\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9cc1681f-a720-40b5-a0fa-1d414f3f4906-konnectivity-ca\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-lib-modules\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysconfig\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g882j\" (UniqueName: \"kubernetes.io/projected/07979851-c8ab-4500-998e-e7498964b0a7-kube-api-access-g882j\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-conf\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-var-lib-kubelet\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.323451 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-var-lib-kubelet\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.324172 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323516 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-kubernetes\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.324172 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-tmp-dir\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.324172 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323550 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-lib-modules\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.324172 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07979851-c8ab-4500-998e-e7498964b0a7-etc-sysctl-conf\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.324172 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.323847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9cc1681f-a720-40b5-a0fa-1d414f3f4906-konnectivity-ca\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:24.325219 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.325198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-tmp\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.325334 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.325260 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07979851-c8ab-4500-998e-e7498964b0a7-etc-tuned\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.325745 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.325728 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9cc1681f-a720-40b5-a0fa-1d414f3f4906-agent-certs\") pod \"konnectivity-agent-2gt98\" (UID: \"9cc1681f-a720-40b5-a0fa-1d414f3f4906\") " pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:24.340448 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.340392 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:24.340448 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.340421 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:24.340448 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.340434 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:24.340611 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.340526 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:24.84050601 +0000 UTC m=+3.178491828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:24.342252 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.342234 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g882j\" (UniqueName: \"kubernetes.io/projected/07979851-c8ab-4500-998e-e7498964b0a7-kube-api-access-g882j\") pod \"tuned-9q6ht\" (UID: \"07979851-c8ab-4500-998e-e7498964b0a7\") " pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.342484 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.342464 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwpgh\" (UniqueName: \"kubernetes.io/projected/86c95b36-8aa3-4c99-b0b6-3746cf836c8c-kube-api-access-lwpgh\") pod \"node-resolver-q65sk\" (UID: \"86c95b36-8aa3-4c99-b0b6-3746cf836c8c\") " pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.406556 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.406520 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nnfk2"
Apr 22 18:46:24.417192 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.417162 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x6px5"
Apr 22 18:46:24.419018 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.418996 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:46:24.426972 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.426948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lvpr2"
Apr 22 18:46:24.433565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.433541 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:24.442223 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.442205 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd"
Apr 22 18:46:24.449759 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.449741 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fsns5"
Apr 22 18:46:24.457370 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.457354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q65sk"
Apr 22 18:46:24.462962 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.462943 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9q6ht"
Apr 22 18:46:24.469495 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.469480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:24.726223 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.726137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:24.726363 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.726235 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:24.726363 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.726297 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:25.726281942 +0000 UTC m=+4.064267765 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:24.886351 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.886321 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc61b10e_6ece_4657_9ee5_9f863cd9a3d9.slice/crio-cf8550927cc73f6611aa20d51bcfc4d3deed07df4c2b6f638960d0e34bb07783 WatchSource:0}: Error finding container cf8550927cc73f6611aa20d51bcfc4d3deed07df4c2b6f638960d0e34bb07783: Status 404 returned error can't find the container with id cf8550927cc73f6611aa20d51bcfc4d3deed07df4c2b6f638960d0e34bb07783
Apr 22 18:46:24.889996 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.887742 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f9f4d1_15dc_40be_a8db_e7a35cb819c1.slice/crio-969aac772dfc43c17ceef04d62dbb3d10e1c59abb24140a52c55c1de723f566b WatchSource:0}: Error finding container 969aac772dfc43c17ceef04d62dbb3d10e1c59abb24140a52c55c1de723f566b: Status 404 returned error can't find the container with id 969aac772dfc43c17ceef04d62dbb3d10e1c59abb24140a52c55c1de723f566b
Apr 22 18:46:24.891167 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.891090 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c76f00a_74ae_463c_9b29_4b39f9d6a26d.slice/crio-6678f50e24636d1d84af3d1cd8fe222e3ae4b26cf2d292aa94a003029c93c086 WatchSource:0}: Error finding container 6678f50e24636d1d84af3d1cd8fe222e3ae4b26cf2d292aa94a003029c93c086: Status 404 returned error can't find the container with id 6678f50e24636d1d84af3d1cd8fe222e3ae4b26cf2d292aa94a003029c93c086
Apr 22 18:46:24.894447 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.894423 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d298d6_4725_419c_b9f4_0f58a63b1715.slice/crio-e6fd82f913010bf5f485e7e1b4675a06f53ed34d0eb4e2f36c566e95573bac6e WatchSource:0}: Error finding container e6fd82f913010bf5f485e7e1b4675a06f53ed34d0eb4e2f36c566e95573bac6e: Status 404 returned error can't find the container with id e6fd82f913010bf5f485e7e1b4675a06f53ed34d0eb4e2f36c566e95573bac6e
Apr 22 18:46:24.895665 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.895645 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07979851_c8ab_4500_998e_e7498964b0a7.slice/crio-dffc5b2ef085f28c74f1e651e9a0b7aaebb71bf79783e8990028c22c678b1da5 WatchSource:0}: Error finding container dffc5b2ef085f28c74f1e651e9a0b7aaebb71bf79783e8990028c22c678b1da5: Status 404 returned error can't find the container with id dffc5b2ef085f28c74f1e651e9a0b7aaebb71bf79783e8990028c22c678b1da5
Apr 22 18:46:24.896735 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.896717 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd75ef0_f7af_4a32_b19b_aa29b44cd391.slice/crio-639314391f652e5fffb6c061fe0e4470876c41db3654ceaa9bf0bed04b6b2c39 WatchSource:0}: Error finding container 639314391f652e5fffb6c061fe0e4470876c41db3654ceaa9bf0bed04b6b2c39: Status 404 returned error can't find the container with id 639314391f652e5fffb6c061fe0e4470876c41db3654ceaa9bf0bed04b6b2c39
Apr 22 18:46:24.898286 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.898183 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c95b36_8aa3_4c99_b0b6_3746cf836c8c.slice/crio-0f72f688c77a5c7f4e530026ec0def395e55ce2c94bc848551f0f77abb00a7d1 WatchSource:0}: Error finding container 0f72f688c77a5c7f4e530026ec0def395e55ce2c94bc848551f0f77abb00a7d1: Status 404 returned error can't find the container with id 0f72f688c77a5c7f4e530026ec0def395e55ce2c94bc848551f0f77abb00a7d1
Apr 22 18:46:24.900665 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.900602 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc1681f_a720_40b5_a0fa_1d414f3f4906.slice/crio-dafb9ce3fb4068a74485e146370dbb67f142d571dc01bf0d0a1493a13c888125 WatchSource:0}: Error finding container dafb9ce3fb4068a74485e146370dbb67f142d571dc01bf0d0a1493a13c888125: Status 404 returned error can't find the container with id dafb9ce3fb4068a74485e146370dbb67f142d571dc01bf0d0a1493a13c888125
Apr 22 18:46:24.901005 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:24.900970 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26feee8a_9b45_4708_a356_fcabada1a28c.slice/crio-ec3d184c26a2ef11fec65f16539f045262c7f0a7f0eb16cca30da3c018755f5a WatchSource:0}: Error finding container ec3d184c26a2ef11fec65f16539f045262c7f0a7f0eb16cca30da3c018755f5a: Status 404 returned error can't find the container with id ec3d184c26a2ef11fec65f16539f045262c7f0a7f0eb16cca30da3c018755f5a
Apr 22 18:46:24.927557 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:24.927402 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:24.927638 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.927545 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:24.927683 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.927654 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:24.927683 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.927671 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:24.927746 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:24.927732 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:25.927711026 +0000 UTC m=+4.265696859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:25.145291 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.145251 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:23 +0000 UTC" deadline="2027-12-24 16:56:18.29006348 +0000 UTC"
Apr 22 18:46:25.145291 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.145287 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14662h9m53.144779844s"
Apr 22 18:46:25.208606 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.208567 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q65sk" event={"ID":"86c95b36-8aa3-4c99-b0b6-3746cf836c8c","Type":"ContainerStarted","Data":"0f72f688c77a5c7f4e530026ec0def395e55ce2c94bc848551f0f77abb00a7d1"}
Apr 22 18:46:25.209838 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.209800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" event={"ID":"07979851-c8ab-4500-998e-e7498964b0a7","Type":"ContainerStarted","Data":"dffc5b2ef085f28c74f1e651e9a0b7aaebb71bf79783e8990028c22c678b1da5"}
Apr 22 18:46:25.210956 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.210926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x6px5" event={"ID":"53d298d6-4725-419c-b9f4-0f58a63b1715","Type":"ContainerStarted","Data":"e6fd82f913010bf5f485e7e1b4675a06f53ed34d0eb4e2f36c566e95573bac6e"}
Apr 22 18:46:25.212330 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.212298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"6678f50e24636d1d84af3d1cd8fe222e3ae4b26cf2d292aa94a003029c93c086"}
Apr 22 18:46:25.213476 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.213458 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lvpr2" event={"ID":"28f9f4d1-15dc-40be-a8db-e7a35cb819c1","Type":"ContainerStarted","Data":"969aac772dfc43c17ceef04d62dbb3d10e1c59abb24140a52c55c1de723f566b"}
Apr 22 18:46:25.217699 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.217653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" event={"ID":"44e2894ddf571d488225d543b36d7bb8","Type":"ContainerStarted","Data":"03a0cbae692e315a5b70f2c3c296686c98120fc6987f55d73fd28ef23c2af68a"}
Apr 22 18:46:25.218944 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.218922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nnfk2" event={"ID":"26feee8a-9b45-4708-a356-fcabada1a28c","Type":"ContainerStarted","Data":"ec3d184c26a2ef11fec65f16539f045262c7f0a7f0eb16cca30da3c018755f5a"}
Apr 22 18:46:25.219878 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.219858 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2gt98" event={"ID":"9cc1681f-a720-40b5-a0fa-1d414f3f4906","Type":"ContainerStarted","Data":"dafb9ce3fb4068a74485e146370dbb67f142d571dc01bf0d0a1493a13c888125"}
Apr 22 18:46:25.221424 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.221402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerStarted","Data":"639314391f652e5fffb6c061fe0e4470876c41db3654ceaa9bf0bed04b6b2c39"}
Apr 22 18:46:25.222644 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.222622 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" event={"ID":"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9","Type":"ContainerStarted","Data":"cf8550927cc73f6611aa20d51bcfc4d3deed07df4c2b6f638960d0e34bb07783"}
Apr 22 18:46:25.229916 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.229856 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-84.ec2.internal" podStartSLOduration=2.229841815 podStartE2EDuration="2.229841815s" podCreationTimestamp="2026-04-22 18:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:25.229577097 +0000 UTC m=+3.567562939" watchObservedRunningTime="2026-04-22 18:46:25.229841815 +0000 UTC m=+3.567827657"
Apr 22 18:46:25.735489 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.734874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:25.735489 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.735084 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:25.735489 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.735147 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:27.735126338 +0000 UTC m=+6.073112159 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:25.937758 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:25.937102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:25.937758 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.937285 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:25.937758 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.937302 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:25.937758 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.937317 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:25.937758 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:25.937374 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed.
No retries permitted until 2026-04-22 18:46:27.937355875 +0000 UTC m=+6.275341709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:26.204161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:26.203409 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:26.204161 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:26.203521 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:26.204161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:26.203968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:26.204161 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:26.204066 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:26.238575 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:26.238546 2572 generic.go:358] "Generic (PLEG): container finished" podID="9ccd50016e9bd4e3cb2e95f7158c6eed" containerID="d121507370e38366f2fde305db3aa28c56cff0bb94f884bdd06af0b796192764" exitCode=0 Apr 22 18:46:26.238746 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:26.238612 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" event={"ID":"9ccd50016e9bd4e3cb2e95f7158c6eed","Type":"ContainerDied","Data":"d121507370e38366f2fde305db3aa28c56cff0bb94f884bdd06af0b796192764"} Apr 22 18:46:27.247462 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:27.246450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" event={"ID":"9ccd50016e9bd4e3cb2e95f7158c6eed","Type":"ContainerStarted","Data":"99161f597ae09b966009be30eb0e5e983c29c780347605971631f4a0fd605841"} Apr 22 18:46:27.751427 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:27.751391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:27.751606 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.751566 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:27.751669 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.751642 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs 
podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.7516212 +0000 UTC m=+10.089607025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:27.953092 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:27.952666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:27.953092 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.952820 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:27.953092 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.952841 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:27.953092 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.952853 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:27.953092 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:27.952928 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.952892198 +0000 UTC m=+10.290878020 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:28.201221 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:28.201183 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:28.201382 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:28.201318 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:28.201382 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:28.201338 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:28.201487 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:28.201409 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:30.201182 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:30.201154 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:30.201622 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:30.201192 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:30.201622 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:30.201282 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:30.201622 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:30.201401 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:31.780217 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:31.780150 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:31.780678 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.780258 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:31.780678 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.780330 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:39.780308034 +0000 UTC m=+18.118293866 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:31.981643 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:31.981606 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:31.981821 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.981796 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:31.981884 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.981825 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:31.981884 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.981840 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:31.982018 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:31.981920 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. 
No retries permitted until 2026-04-22 18:46:39.981883767 +0000 UTC m=+18.319869609 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:32.202494 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:32.202122 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:32.202494 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:32.202225 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:32.202494 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:32.202284 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:32.202494 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:32.202365 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:34.200850 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.200642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:34.201334 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.200722 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:34.201334 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:34.200989 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:34.201334 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:34.201033 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:34.258600 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.258563 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q65sk" event={"ID":"86c95b36-8aa3-4c99-b0b6-3746cf836c8c","Type":"ContainerStarted","Data":"8a026f3e926d6a45888d5a6f782212dd07ab0971998a4c028c461f1789197b1e"} Apr 22 18:46:34.260001 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.259966 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" event={"ID":"07979851-c8ab-4500-998e-e7498964b0a7","Type":"ContainerStarted","Data":"c41010bbae30d3ab590eab49a910a335c4ca3a52ce43d46bb4fbbd80b9b094e3"} Apr 22 18:46:34.261380 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.261354 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x6px5" event={"ID":"53d298d6-4725-419c-b9f4-0f58a63b1715","Type":"ContainerStarted","Data":"e284140a0bbb4c4869cfe2a10d2fbb449533e90f3d478f962a061d6792adaa0e"} Apr 22 18:46:34.262808 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.262777 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2gt98" event={"ID":"9cc1681f-a720-40b5-a0fa-1d414f3f4906","Type":"ContainerStarted","Data":"aa367482ee4451ca99fa512fcff27d67be11566daa52be1d121e8939651d90ce"} Apr 22 18:46:34.264242 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.264220 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerStarted","Data":"42a1e1a8ada409a3878121e55d86f6d4aeacd703e5bab815f4c282fb9c3b9122"} Apr 22 18:46:34.265656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.265635 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" 
event={"ID":"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9","Type":"ContainerStarted","Data":"3aa390e58d3de7f4513726d283cb98bfb57618477ed0adceabc58d5718d2826c"} Apr 22 18:46:34.270054 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.270009 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-84.ec2.internal" podStartSLOduration=11.269994905 podStartE2EDuration="11.269994905s" podCreationTimestamp="2026-04-22 18:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:27.25861056 +0000 UTC m=+5.596596404" watchObservedRunningTime="2026-04-22 18:46:34.269994905 +0000 UTC m=+12.607980747" Apr 22 18:46:34.270936 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.270880 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q65sk" podStartSLOduration=3.882734226 podStartE2EDuration="12.270872067s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.900842092 +0000 UTC m=+3.238827916" lastFinishedPulling="2026-04-22 18:46:33.288979935 +0000 UTC m=+11.626965757" observedRunningTime="2026-04-22 18:46:34.269818376 +0000 UTC m=+12.607804220" watchObservedRunningTime="2026-04-22 18:46:34.270872067 +0000 UTC m=+12.608857912" Apr 22 18:46:34.296587 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.296543 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9q6ht" podStartSLOduration=3.890615651 podStartE2EDuration="12.296529032s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.898650008 +0000 UTC m=+3.236635840" lastFinishedPulling="2026-04-22 18:46:33.3045634 +0000 UTC m=+11.642549221" observedRunningTime="2026-04-22 18:46:34.296005775 +0000 UTC m=+12.633991617" 
watchObservedRunningTime="2026-04-22 18:46:34.296529032 +0000 UTC m=+12.634514890" Apr 22 18:46:34.324065 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.324012 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2gt98" podStartSLOduration=3.939760527 podStartE2EDuration="12.323995915s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.902725533 +0000 UTC m=+3.240711352" lastFinishedPulling="2026-04-22 18:46:33.286960921 +0000 UTC m=+11.624946740" observedRunningTime="2026-04-22 18:46:34.307288659 +0000 UTC m=+12.645274501" watchObservedRunningTime="2026-04-22 18:46:34.323995915 +0000 UTC m=+12.661981759" Apr 22 18:46:34.324280 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:34.324190 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x6px5" podStartSLOduration=3.930634919 podStartE2EDuration="12.324184331s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.896332499 +0000 UTC m=+3.234318333" lastFinishedPulling="2026-04-22 18:46:33.289881926 +0000 UTC m=+11.627867745" observedRunningTime="2026-04-22 18:46:34.323785029 +0000 UTC m=+12.661770869" watchObservedRunningTime="2026-04-22 18:46:34.324184331 +0000 UTC m=+12.662170171" Apr 22 18:46:35.268791 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:35.268719 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lvpr2" event={"ID":"28f9f4d1-15dc-40be-a8db-e7a35cb819c1","Type":"ContainerStarted","Data":"b50dbca072a90a1ec49082faae6c3366c10cee025fbb816038ae1fecd5ee46c4"} Apr 22 18:46:35.279853 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:35.279809 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lvpr2" podStartSLOduration=4.884965553 podStartE2EDuration="13.279797154s" 
podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.891929412 +0000 UTC m=+3.229915231" lastFinishedPulling="2026-04-22 18:46:33.286760999 +0000 UTC m=+11.624746832" observedRunningTime="2026-04-22 18:46:35.279530795 +0000 UTC m=+13.617516636" watchObservedRunningTime="2026-04-22 18:46:35.279797154 +0000 UTC m=+13.617782996" Apr 22 18:46:36.201269 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:36.201232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:36.201436 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:36.201287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:36.201436 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:36.201380 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:36.201685 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:36.201640 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:38.201364 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:38.201333 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:38.202020 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:38.201447 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:38.202020 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:38.201507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:38.202020 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:38.201618 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:38.275664 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:38.275587 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="42a1e1a8ada409a3878121e55d86f6d4aeacd703e5bab815f4c282fb9c3b9122" exitCode=0
Apr 22 18:46:38.275664 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:38.275631 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"42a1e1a8ada409a3878121e55d86f6d4aeacd703e5bab815f4c282fb9c3b9122"}
Apr 22 18:46:38.761917 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:38.761634 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 18:46:39.137253 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.136505 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:39.137418 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.137288 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:39.178126 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.177969 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:38.761784888Z","UUID":"df4818f1-64de-4e17-b27e-06a0e39c74de","Handler":null,"Name":"","Endpoint":""}
Apr 22 18:46:39.180944 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.180920 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 18:46:39.180944 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.180948 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 18:46:39.279732 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.279696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" event={"ID":"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9","Type":"ContainerStarted","Data":"4d9ae32f07c470c8ab33459ab981068ecd704333982f42a10e1e95004d93e1f4"}
Apr 22 18:46:39.283520 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283495 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"f842497b6fa643d158ff49aeb0d31d3b43d7754c205de4081ff22dcd7eefd94c"}
Apr 22 18:46:39.283636 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283528 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"8e08f9182e46039d513bcb1d782399935aa1a5f97d86a699d08a601d6247831d"}
Apr 22 18:46:39.283636 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"5d766ea979a17487a1c9e5ebf9702fcdcc7fac22226d55a75aaa7aa9692c13ce"}
Apr 22 18:46:39.283636 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"6e72d5fc08297edce4793054093de7053928335ebcf73133775238d7dba18dd5"}
Apr 22 18:46:39.283636 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283565 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"1853ab5084b0fac9a379c208090012cbfad22b5be6382f37c57fbe6266bd0768"}
Apr 22 18:46:39.283636 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.283577 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"75fb87bd6cdb7a7c73fa02cc7330efc37d80622cbc46d099421fe7c216316286"}
Apr 22 18:46:39.838143 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:39.838117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:39.838286 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:39.838262 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:39.838355 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:39.838344 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:55.838323972 +0000 UTC m=+34.176309795 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:40.039100 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:40.039062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:40.039302 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.039279 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:40.039373 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.039306 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:40.039373 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.039316 2572 projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:40.039473 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.039412 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:56.039393726 +0000 UTC m=+34.377379553 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:40.201399 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:40.201316 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:40.201399 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:40.201337 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:40.201610 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.201443 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:40.201610 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:40.201574 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:40.287281 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:40.287224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" event={"ID":"fc61b10e-6ece-4657-9ee5-9f863cd9a3d9","Type":"ContainerStarted","Data":"15caeb227630dbf644f932046f440c211af8ffbb55b557a6e36eb5c3167994a7"}
Apr 22 18:46:40.303217 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:40.302153 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8xvnd" podStartSLOduration=3.443812373 podStartE2EDuration="18.302135062s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.891509635 +0000 UTC m=+3.229495470" lastFinishedPulling="2026-04-22 18:46:39.749832323 +0000 UTC m=+18.087818159" observedRunningTime="2026-04-22 18:46:40.299960096 +0000 UTC m=+18.637945940" watchObservedRunningTime="2026-04-22 18:46:40.302135062 +0000 UTC m=+18.640120905"
Apr 22 18:46:42.202036 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:42.202008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:42.202613 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:42.202104 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:42.202613 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:42.202180 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:42.202613 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:42.202270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:43.675004 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.674974 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-f4bd6"]
Apr 22 18:46:43.734581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.734548 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.734771 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:43.734635 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:43.866335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.866304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-dbus\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.866335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.866340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-kubelet-config\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.866549 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.866373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.967584 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.967513 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-dbus\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.967584 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.967549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-kubelet-config\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.967584 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.967574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.967816 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.967653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-kubelet-config\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:43.967816 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:43.967670 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:43.967816 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:43.967745 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:44.467727224 +0000 UTC m=+22.805713056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:43.967816 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:43.967786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-dbus\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:44.200536 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:44.200498 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:44.200728 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:44.200502 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:44.200728 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:44.200621 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:44.203179 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:44.201502 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:44.471912 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:44.471861 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:44.472177 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:44.472031 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:44.472177 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:44.472110 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:45.472086872 +0000 UTC m=+23.810072710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:45.200953 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:45.200915 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:45.201373 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:45.201034 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:45.480514 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:45.480436 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:45.480682 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:45.480562 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:45.480682 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:45.480628 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.480609423 +0000 UTC m=+25.818595241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:46.200659 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.200618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:46.200659 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.200653 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:46.200847 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:46.200772 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:46.200936 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:46.200914 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:46.300308 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.300219 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"5169e28ee13ce7ccd061a9f0e5d76ee3630f1d304a8e6b46ad4320f6ea54f540"}
Apr 22 18:46:46.301403 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.301379 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nnfk2" event={"ID":"26feee8a-9b45-4708-a356-fcabada1a28c","Type":"ContainerStarted","Data":"22e415794f41ed56cbab00d18c748ec08b117b026b24842042b1a4a219de9527"}
Apr 22 18:46:46.302873 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.302851 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="19a2dac48985f9ffec95543e9ffa36d28c09d70740054410300b6bfd9a754f42" exitCode=0
Apr 22 18:46:46.302979 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.302887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"19a2dac48985f9ffec95543e9ffa36d28c09d70740054410300b6bfd9a754f42"}
Apr 22 18:46:46.317719 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:46.317672 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nnfk2" podStartSLOduration=3.277280029 podStartE2EDuration="24.317659789s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.903583895 +0000 UTC m=+3.241569716" lastFinishedPulling="2026-04-22 18:46:45.943963653 +0000 UTC m=+24.281949476" observedRunningTime="2026-04-22 18:46:46.316849568 +0000 UTC m=+24.654835405" watchObservedRunningTime="2026-04-22 18:46:46.317659789 +0000 UTC m=+24.655645681"
Apr 22 18:46:47.201150 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:47.200966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:47.201300 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:47.201220 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:47.305700 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:47.305668 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="14b3bdeff414d88d00b925d4cbf0c7539df53aab4ac4a74775d0cf65b713108a" exitCode=0
Apr 22 18:46:47.306075 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:47.305754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"14b3bdeff414d88d00b925d4cbf0c7539df53aab4ac4a74775d0cf65b713108a"}
Apr 22 18:46:47.496860 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:47.496831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:47.497025 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:47.496937 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:47.497025 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:47.496983 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:51.496969027 +0000 UTC m=+29.834954847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:48.200969 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.200847 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:48.201107 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:48.200993 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:48.201107 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.201015 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:48.201399 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:48.201115 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:48.309503 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.309468 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="02677e893789467a16f5fedfeb513188a0e3fcc07633f6148df19fe86b1c7a3b" exitCode=0
Apr 22 18:46:48.309989 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.309539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"02677e893789467a16f5fedfeb513188a0e3fcc07633f6148df19fe86b1c7a3b"}
Apr 22 18:46:48.312748 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.312724 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" event={"ID":"1c76f00a-74ae-463c-9b29-4b39f9d6a26d","Type":"ContainerStarted","Data":"573bb4dccca7de05928c92245c8d89d18308174f0e053edf4883eb1f20a2f541"}
Apr 22 18:46:48.313039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.313005 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:48.313039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.313030 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:48.328128 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.328105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:48.351502 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:48.351456 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" podStartSLOduration=12.989311566 podStartE2EDuration="26.351445295s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.893122698 +0000 UTC m=+3.231108516" lastFinishedPulling="2026-04-22 18:46:38.255256412 +0000 UTC m=+16.593242245" observedRunningTime="2026-04-22 18:46:48.350996417 +0000 UTC m=+26.688982258" watchObservedRunningTime="2026-04-22 18:46:48.351445295 +0000 UTC m=+26.689431135"
Apr 22 18:46:49.200917 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:49.200869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:49.201116 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:49.201008 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:49.315213 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:49.315180 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:49.335001 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:49.334924 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8"
Apr 22 18:46:50.200743 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:50.200704 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:50.200947 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:50.200714 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:50.200947 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:50.200840 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:50.201070 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:50.200954 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:51.201388 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:51.201354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:51.201830 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:51.201472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:51.528551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:51.528515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:51.528729 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:51.528673 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:51.528801 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:51.528750 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.528727415 +0000 UTC m=+37.866713247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:52.201337 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:52.201307 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:52.201538 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:52.201409 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:52.201538 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:52.201501 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:52.201947 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:52.201635 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:53.200877 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:53.200843 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:53.201057 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:53.200982 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:53.458650 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:53.458568 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:53.459061 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:53.458721 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 18:46:53.459146 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:53.459127 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2gt98"
Apr 22 18:46:54.200776 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.200743 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:54.201042 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:54.200846 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e"
Apr 22 18:46:54.201042 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.200876 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc"
Apr 22 18:46:54.201042 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:54.200946 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e"
Apr 22 18:46:54.574356 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.574280 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f4bd6"]
Apr 22 18:46:54.575204 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.574426 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6"
Apr 22 18:46:54.575204 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:54.574535 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d"
Apr 22 18:46:54.577052 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.577026 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h9t7j"]
Apr 22 18:46:54.577225 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.577114 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j"
Apr 22 18:46:54.577347 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:54.577218 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:54.577798 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.577772 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8xjpc"] Apr 22 18:46:54.577919 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:54.577855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:54.577991 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:54.577970 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:55.329731 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:55.329699 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="d065bef03e190198edf8d353a53a7a94d2e505e8553b665029b0c08df31df380" exitCode=0 Apr 22 18:46:55.329879 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:55.329739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"d065bef03e190198edf8d353a53a7a94d2e505e8553b665029b0c08df31df380"} Apr 22 18:46:55.864661 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:55.864629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " 
pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:55.865228 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:55.864758 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:55.865228 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:55.864814 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:27.864798339 +0000 UTC m=+66.202784164 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:56.066918 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.066867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:56.067071 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.067011 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:56.067071 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.067026 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:56.067071 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.067035 2572 
projected.go:194] Error preparing data for projected volume kube-api-access-hn9ch for pod openshift-network-diagnostics/network-check-target-h9t7j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:56.067170 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.067086 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch podName:64bb8453-8f86-4f40-ab06-b6f7eb42265e nodeName:}" failed. No retries permitted until 2026-04-22 18:47:28.067070801 +0000 UTC m=+66.405056638 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hn9ch" (UniqueName: "kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch") pod "network-check-target-h9t7j" (UID: "64bb8453-8f86-4f40-ab06-b6f7eb42265e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:56.200596 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.200527 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:56.200596 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.200555 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:46:56.200793 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.200617 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:56.200793 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.200623 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:56.200793 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.200697 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d" Apr 22 18:46:56.200793 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:56.200768 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:56.333950 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.333920 2572 generic.go:358] "Generic (PLEG): container finished" podID="bcd75ef0-f7af-4a32-b19b-aa29b44cd391" containerID="d6982850a189af59740b55453c8d6b4ec2ab09b2cae3fe7d4ea66258a71fcba6" exitCode=0 Apr 22 18:46:56.334089 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:56.333970 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerDied","Data":"d6982850a189af59740b55453c8d6b4ec2ab09b2cae3fe7d4ea66258a71fcba6"} Apr 22 18:46:57.338785 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:57.338607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fsns5" event={"ID":"bcd75ef0-f7af-4a32-b19b-aa29b44cd391","Type":"ContainerStarted","Data":"8a2675d36d144f9ab6789db9913063e9eb94d896430c3a3395215f628ef3e40c"} Apr 22 18:46:57.359420 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:57.359354 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fsns5" podStartSLOduration=6.04979028 podStartE2EDuration="35.359340984s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:24.900856335 +0000 UTC m=+3.238842165" lastFinishedPulling="2026-04-22 18:46:54.210407036 +0000 UTC m=+32.548392869" observedRunningTime="2026-04-22 18:46:57.358811004 +0000 UTC m=+35.696796846" watchObservedRunningTime="2026-04-22 18:46:57.359340984 +0000 UTC m=+35.697326824" Apr 22 18:46:58.200710 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.200672 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:46:58.200710 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.200692 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:46:58.200971 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.200779 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-f4bd6" podUID="a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d" Apr 22 18:46:58.200971 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.200856 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:46:58.200971 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.200921 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h9t7j" podUID="64bb8453-8f86-4f40-ab06-b6f7eb42265e" Apr 22 18:46:58.200971 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.200848 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:46:58.529733 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.529705 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-84.ec2.internal" event="NodeReady" Apr 22 18:46:58.530165 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.529824 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:46:58.562209 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.562186 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx"] Apr 22 18:46:58.585382 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.585351 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb"] Apr 22 18:46:58.585542 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.585515 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.590581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.588129 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:46:58.590581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.588874 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-f8dgs\"" Apr 22 18:46:58.590581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.589073 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:46:58.590581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.589182 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:46:58.590581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.589198 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:46:58.604233 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.604209 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw"] Apr 22 18:46:58.604364 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.604347 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.606341 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.606320 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:46:58.628738 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.628714 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:46:58.628848 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.628835 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.630870 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.630854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:46:58.630973 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.630922 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:46:58.631023 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.630997 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:46:58.631112 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.631068 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:46:58.651159 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.651136 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k779l"] Apr 22 18:46:58.651278 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:46:58.651263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.653519 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.653499 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:46:58.653621 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.653592 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:46:58.653804 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.653790 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8zvm4\"" Apr 22 18:46:58.653928 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.653892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:46:58.657663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.657646 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:46:58.669597 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.669578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb"] Apr 22 18:46:58.669676 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.669601 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx"] Apr 22 18:46:58.669676 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.669613 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9mmmk"] Apr 22 18:46:58.669762 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.669749 2572 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.671826 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.671654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:46:58.671826 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.671738 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:46:58.671826 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.671739 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:46:58.686867 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.686841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h87k4\" (UniqueName: \"kubernetes.io/projected/f6f853f9-1ef8-49e6-870b-b177504bcdc3-kube-api-access-h87k4\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.686970 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.686879 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f6f853f9-1ef8-49e6-870b-b177504bcdc3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.686970 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.686956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86f416fa-d29f-410c-b2a3-3d88449ed7f2-tmp\") pod 
\"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.687051 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.686982 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxq4\" (UniqueName: \"kubernetes.io/projected/86f416fa-d29f-410c-b2a3-3d88449ed7f2-kube-api-access-hxxq4\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.687051 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.687019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/86f416fa-d29f-410c-b2a3-3d88449ed7f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.696658 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.696640 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:46:58.696729 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.696662 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k779l"] Apr 22 18:46:58.696729 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.696670 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw"] Apr 22 18:46:58.696729 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.696678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9mmmk"] Apr 22 18:46:58.696826 ip-10-0-138-84 kubenswrapper[2572]: 
I0422 18:46:58.696782 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:58.698676 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.698659 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:46:58.698761 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.698713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:46:58.698761 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.698735 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:46:58.698864 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.698773 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:46:58.787663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787580 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h87k4\" (UniqueName: \"kubernetes.io/projected/f6f853f9-1ef8-49e6-870b-b177504bcdc3-kube-api-access-h87k4\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.787663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.787663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zwvh\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-tmp-dir\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787700 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787716 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f6f853f9-1ef8-49e6-870b-b177504bcdc3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787892 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-config-volume\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.787999 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.787974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788010 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwgc5\" (UniqueName: \"kubernetes.io/projected/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-kube-api-access-wwgc5\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86f416fa-d29f-410c-b2a3-3d88449ed7f2-tmp\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxq4\" (UniqueName: \"kubernetes.io/projected/86f416fa-d29f-410c-b2a3-3d88449ed7f2-kube-api-access-hxxq4\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/86f416fa-d29f-410c-b2a3-3d88449ed7f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/10f5468e-4567-428e-b0d6-41836a15fb80-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jjk\" (UniqueName: \"kubernetes.io/projected/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-kube-api-access-44jjk\") pod \"dns-default-k779l\" (UID: 
\"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnq74\" (UniqueName: \"kubernetes.io/projected/10f5468e-4567-428e-b0d6-41836a15fb80-kube-api-access-tnq74\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.788540 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.788497 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86f416fa-d29f-410c-b2a3-3d88449ed7f2-tmp\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.792011 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.791989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f6f853f9-1ef8-49e6-870b-b177504bcdc3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.792110 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.791995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/86f416fa-d29f-410c-b2a3-3d88449ed7f2-klusterlet-config\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.795297 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:46:58.795273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxq4\" (UniqueName: \"kubernetes.io/projected/86f416fa-d29f-410c-b2a3-3d88449ed7f2-kube-api-access-hxxq4\") pod \"klusterlet-addon-workmgr-868778d8f4-z9vkb\" (UID: \"86f416fa-d29f-410c-b2a3-3d88449ed7f2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.795395 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.795375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h87k4\" (UniqueName: \"kubernetes.io/projected/f6f853f9-1ef8-49e6-870b-b177504bcdc3-kube-api-access-h87k4\") pod \"managed-serviceaccount-addon-agent-7d9885797c-zmmtx\" (UID: \"f6f853f9-1ef8-49e6-870b-b177504bcdc3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.889366 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889336 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.889366 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889368 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.889558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.889558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.889558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889431 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.889558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.889558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-config-volume\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.889771 ip-10-0-138-84 kubenswrapper[2572]: E0422 
18:46:58.889662 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.890381 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.390347295 +0000 UTC m=+37.728333123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.889501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 
18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-config-volume\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.890555 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:58.890657 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.890601 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.390588377 +0000 UTC m=+37.728574199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwgc5\" (UniqueName: \"kubernetes.io/projected/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-kube-api-access-wwgc5\") pod \"ingress-canary-9mmmk\" (UID: 
\"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890863 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/10f5468e-4567-428e-b0d6-41836a15fb80-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890985 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44jjk\" (UniqueName: \"kubernetes.io/projected/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-kube-api-access-44jjk\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.890987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnq74\" (UniqueName: \"kubernetes.io/projected/10f5468e-4567-428e-b0d6-41836a15fb80-kube-api-access-tnq74\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwvh\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh\") pod \"image-registry-66bff9f979-6th2g\" (UID: 
\"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-tmp-dir\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891380 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891494 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-tmp-dir\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.891503 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:58.892149 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.891519 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:46:58.892809 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:58.891602 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls 
podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.391577946 +0000 UTC m=+37.729563772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:46:58.892809 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.891625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/10f5468e-4567-428e-b0d6-41836a15fb80-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.892956 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.892936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.893088 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.893066 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.893929 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.893385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.896151 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.896132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-ca\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.897148 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.897123 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.897320 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.897304 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/10f5468e-4567-428e-b0d6-41836a15fb80-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.899934 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.899835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jjk\" (UniqueName: \"kubernetes.io/projected/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-kube-api-access-44jjk\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 
22 18:46:58.900141 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.900117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.900398 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.900367 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwvh\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:58.900468 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.900418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwgc5\" (UniqueName: \"kubernetes.io/projected/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-kube-api-access-wwgc5\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:58.900677 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.900656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnq74\" (UniqueName: \"kubernetes.io/projected/10f5468e-4567-428e-b0d6-41836a15fb80-kube-api-access-tnq74\") pod \"cluster-proxy-proxy-agent-8694f9ff5b-rnrkw\" (UID: \"10f5468e-4567-428e-b0d6-41836a15fb80\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:58.907774 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.907756 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" Apr 22 18:46:58.914499 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.914479 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:46:58.936458 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:58.936427 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:46:59.102839 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.102809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx"] Apr 22 18:46:59.105835 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.105814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb"] Apr 22 18:46:59.106713 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.106688 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw"] Apr 22 18:46:59.115453 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:59.115429 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6f853f9_1ef8_49e6_870b_b177504bcdc3.slice/crio-c7bcca9d35b3d4df707bfbfeb02705b9f63171048ce539eeb5f5a099c6c6f2b9 WatchSource:0}: Error finding container c7bcca9d35b3d4df707bfbfeb02705b9f63171048ce539eeb5f5a099c6c6f2b9: Status 404 returned error can't find the container with id c7bcca9d35b3d4df707bfbfeb02705b9f63171048ce539eeb5f5a099c6c6f2b9 Apr 22 18:46:59.115650 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:59.115628 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10f5468e_4567_428e_b0d6_41836a15fb80.slice/crio-72e2bb818462fbc8b65f42b7919f44b55a73abde0936c137d61e591b06206677 WatchSource:0}: Error finding container 72e2bb818462fbc8b65f42b7919f44b55a73abde0936c137d61e591b06206677: Status 404 returned error can't find the container with id 72e2bb818462fbc8b65f42b7919f44b55a73abde0936c137d61e591b06206677 Apr 22 18:46:59.132261 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:46:59.132235 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f416fa_d29f_410c_b2a3_3d88449ed7f2.slice/crio-fe5a18252ca1c519ffbc92cf43463e78094a27e593b6b4c859f7dff5bcf46242 WatchSource:0}: Error finding container fe5a18252ca1c519ffbc92cf43463e78094a27e593b6b4c859f7dff5bcf46242: Status 404 returned error can't find the container with id fe5a18252ca1c519ffbc92cf43463e78094a27e593b6b4c859f7dff5bcf46242 Apr 22 18:46:59.344014 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.343927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerStarted","Data":"72e2bb818462fbc8b65f42b7919f44b55a73abde0936c137d61e591b06206677"} Apr 22 18:46:59.344817 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.344789 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" event={"ID":"86f416fa-d29f-410c-b2a3-3d88449ed7f2","Type":"ContainerStarted","Data":"fe5a18252ca1c519ffbc92cf43463e78094a27e593b6b4c859f7dff5bcf46242"} Apr 22 18:46:59.345667 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.345648 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" 
event={"ID":"f6f853f9-1ef8-49e6-870b-b177504bcdc3","Type":"ContainerStarted","Data":"c7bcca9d35b3d4df707bfbfeb02705b9f63171048ce539eeb5f5a099c6c6f2b9"} Apr 22 18:46:59.396045 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.396023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:46:59.396157 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.396050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:46:59.396199 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396158 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:46:59.396199 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396162 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:46:59.396269 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396201 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.396187634 +0000 UTC m=+38.734173453 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:46:59.396269 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396225 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.396206291 +0000 UTC m=+38.734192125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:46:59.396269 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.396251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:46:59.396385 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396357 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:46:59.396385 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396366 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:46:59.396444 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.396398 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:00.396388172 +0000 UTC m=+38.734374004 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:46:59.597603 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:46:59.597526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:46:59.598167 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.597678 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:59.598167 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:46:59.597741 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret podName:a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d nodeName:}" failed. No retries permitted until 2026-04-22 18:47:15.597727767 +0000 UTC m=+53.935713590 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret") pod "global-pull-secret-syncer-f4bd6" (UID: "a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:00.203349 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.202104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:47:00.203573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.203494 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:47:00.204408 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.203885 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:47:00.206602 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.206195 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:00.206602 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.206440 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:47:00.208526 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.208310 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\"" Apr 22 18:47:00.208526 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.208320 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:00.208685 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.208661 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\"" Apr 22 18:47:00.209758 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.208852 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:00.405631 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.405594 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:47:00.405833 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.405645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:47:00.405833 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:00.405734 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:47:00.405833 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.405764 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:00.406223 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.405852 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:02.40582987 +0000 UTC m=+40.743815694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:47:00.406223 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.406164 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:00.406332 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.406250 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.406232656 +0000 UTC m=+40.744218477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:47:00.406332 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.406268 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:00.406332 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.406284 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:47:00.406522 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:00.406334 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:02.40631865 +0000 UTC m=+40.744304483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:47:02.427802 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:02.427593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:02.427828 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.427765 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:02.427929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.427978 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls 
podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:06.427953857 +0000 UTC m=+44.765939689 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.428030 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.428093 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:06.428078707 +0000 UTC m=+44.766064542 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.428033 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.428112 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:47:02.428241 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:02.428138 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:06.428129476 +0000 UTC m=+44.766115302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:47:05.364731 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.364697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerStarted","Data":"f330980f018c7963de49de305fc3d1c9e4cd4b0d16049b4b778035d6d61f3918"} Apr 22 18:47:05.365930 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.365893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" event={"ID":"86f416fa-d29f-410c-b2a3-3d88449ed7f2","Type":"ContainerStarted","Data":"66c2717ded4ed54ceb63e68470acda6bfcea85bda07f31029212fcd87379645c"} Apr 22 18:47:05.366108 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.366090 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:47:05.367133 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.367114 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" event={"ID":"f6f853f9-1ef8-49e6-870b-b177504bcdc3","Type":"ContainerStarted","Data":"87b9a000238a93bc06001a0881b65e05eca23f5173751c71d790e4af4ece73d4"} Apr 22 18:47:05.367780 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.367760 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:47:05.380109 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.380064 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" podStartSLOduration=18.680314191 podStartE2EDuration="24.380051468s" podCreationTimestamp="2026-04-22 18:46:41 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.133718971 +0000 UTC m=+37.471704791" lastFinishedPulling="2026-04-22 18:47:04.833456244 +0000 UTC m=+43.171442068" observedRunningTime="2026-04-22 18:47:05.379615485 +0000 UTC m=+43.717601326" watchObservedRunningTime="2026-04-22 18:47:05.380051468 +0000 UTC m=+43.718037309" Apr 22 18:47:05.404642 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:05.404605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" podStartSLOduration=18.703514033 podStartE2EDuration="24.404595016s" podCreationTimestamp="2026-04-22 18:46:41 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.117108081 +0000 UTC m=+37.455093900" lastFinishedPulling="2026-04-22 18:47:04.818189048 +0000 UTC m=+43.156174883" observedRunningTime="2026-04-22 18:47:05.40448136 +0000 UTC m=+43.742467202" watchObservedRunningTime="2026-04-22 18:47:05.404595016 +0000 UTC m=+43.742580869" Apr 22 18:47:06.461190 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:06.461156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:06.461247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: 
\"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:06.461273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461322 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461349 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461368 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461421 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461428 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.461409073 +0000 UTC m=+52.799394901 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461505 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.46148487 +0000 UTC m=+52.799470692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:47:06.461717 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:06.461522 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:14.461515947 +0000 UTC m=+52.799501766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:47:07.373264 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:07.373187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerStarted","Data":"89cdb3e096d64f93d54f9f073edd7cbdc9b4302b88e0741905ed1b3c2856115a"} Apr 22 18:47:07.373264 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:07.373224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerStarted","Data":"f37530e2278d64d742dbae03b8d39847dcce9cb8963f8fb9d6fa4deebc515d76"} Apr 22 18:47:07.390853 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:07.390808 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" podStartSLOduration=18.474737287 podStartE2EDuration="26.390795398s" podCreationTimestamp="2026-04-22 18:46:41 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.126859863 +0000 UTC m=+37.464845683" lastFinishedPulling="2026-04-22 18:47:07.042917972 +0000 UTC m=+45.380903794" observedRunningTime="2026-04-22 18:47:07.390344541 +0000 UTC m=+45.728330386" watchObservedRunningTime="2026-04-22 18:47:07.390795398 +0000 UTC m=+45.728781238" Apr 22 18:47:14.520128 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:14.520094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: 
\"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:47:14.520128 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:14.520128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:14.520182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520232 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520296 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520300 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.520282762 +0000 UTC m=+68.858268585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520360 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.520347163 +0000 UTC m=+68.858332983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520298 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520374 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:47:14.520529 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:14.520410 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:30.520400123 +0000 UTC m=+68.858385942 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:47:15.626280 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:15.626228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:47:15.629608 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:15.629577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d-original-pull-secret\") pod \"global-pull-secret-syncer-f4bd6\" (UID: \"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d\") " pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:47:15.834058 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:15.834025 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f4bd6" Apr 22 18:47:15.943752 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:15.943722 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f4bd6"] Apr 22 18:47:15.946937 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:47:15.946892 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ef7ff9_4ca4_496b_a2f3_fc3bd1ed7c4d.slice/crio-9bc38fb14abab4f4a2737dc41d8cb6e121d2822a731e222ab963adcfbb1495f0 WatchSource:0}: Error finding container 9bc38fb14abab4f4a2737dc41d8cb6e121d2822a731e222ab963adcfbb1495f0: Status 404 returned error can't find the container with id 9bc38fb14abab4f4a2737dc41d8cb6e121d2822a731e222ab963adcfbb1495f0 Apr 22 18:47:16.392941 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:16.392892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f4bd6" event={"ID":"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d","Type":"ContainerStarted","Data":"9bc38fb14abab4f4a2737dc41d8cb6e121d2822a731e222ab963adcfbb1495f0"} Apr 22 18:47:20.403757 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:20.403669 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f4bd6" event={"ID":"a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d","Type":"ContainerStarted","Data":"b5bb3a23b6212983c30a138cac40b6d7c8cf3a4f23975fa8698945a55267082a"} Apr 22 18:47:20.417489 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:20.417442 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f4bd6" podStartSLOduration=33.264624821 podStartE2EDuration="37.417429348s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:47:15.948588901 +0000 UTC m=+54.286574720" lastFinishedPulling="2026-04-22 18:47:20.101393427 +0000 UTC m=+58.439379247" 
observedRunningTime="2026-04-22 18:47:20.416973062 +0000 UTC m=+58.754958903" watchObservedRunningTime="2026-04-22 18:47:20.417429348 +0000 UTC m=+58.755415186" Apr 22 18:47:21.337314 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:21.337277 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vtpk8" Apr 22 18:47:27.918889 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:27.918854 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:47:27.921271 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:27.921254 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:27.929965 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:27.929949 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:27.930031 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:27.929999 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:48:31.929983507 +0000 UTC m=+130.267969326 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : secret "metrics-daemon-secret" not found Apr 22 18:47:28.120125 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.120095 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:47:28.122481 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.122463 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:28.132794 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.132774 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:28.143852 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.143826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9ch\" (UniqueName: \"kubernetes.io/projected/64bb8453-8f86-4f40-ab06-b6f7eb42265e-kube-api-access-hn9ch\") pod \"network-check-target-h9t7j\" (UID: \"64bb8453-8f86-4f40-ab06-b6f7eb42265e\") " pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:47:28.423761 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.423735 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\"" Apr 22 18:47:28.432336 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.432318 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:47:28.544565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:28.544533 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h9t7j"] Apr 22 18:47:28.547579 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:47:28.547549 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64bb8453_8f86_4f40_ab06_b6f7eb42265e.slice/crio-a54d54504aa1f80620f5d84125739b28880433d926d979bbf3e9468c59a372a3 WatchSource:0}: Error finding container a54d54504aa1f80620f5d84125739b28880433d926d979bbf3e9468c59a372a3: Status 404 returned error can't find the container with id a54d54504aa1f80620f5d84125739b28880433d926d979bbf3e9468c59a372a3 Apr 22 18:47:29.426157 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:29.426117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h9t7j" event={"ID":"64bb8453-8f86-4f40-ab06-b6f7eb42265e","Type":"ContainerStarted","Data":"a54d54504aa1f80620f5d84125739b28880433d926d979bbf3e9468c59a372a3"} Apr 22 18:47:30.539439 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:30.539404 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:30.539450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:47:30.539891 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:47:30.539493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539526 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539581 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539606 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:02.539582045 +0000 UTC m=+100.877567864 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539608 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539623 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539626 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:02.539616747 +0000 UTC m=+100.877602568 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:47:30.539891 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:47:30.539680 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:02.539661832 +0000 UTC m=+100.877647652 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:47:32.434051 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:32.434011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h9t7j" event={"ID":"64bb8453-8f86-4f40-ab06-b6f7eb42265e","Type":"ContainerStarted","Data":"0c60495fae695ee03cdb1b3dfc1de2fbed4211f905702e6ba6a8091090020286"} Apr 22 18:47:32.434488 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:32.434174 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:47:32.447941 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:47:32.447868 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h9t7j" podStartSLOduration=67.415060116 podStartE2EDuration="1m10.447855709s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:47:28.549629053 +0000 UTC m=+66.887614871" lastFinishedPulling="2026-04-22 18:47:31.582424644 +0000 UTC m=+69.920410464" observedRunningTime="2026-04-22 18:47:32.447253316 +0000 UTC m=+70.785239157" watchObservedRunningTime="2026-04-22 18:47:32.447855709 +0000 UTC m=+70.785841554" Apr 22 18:48:02.560028 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:02.559865 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:48:02.560028 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:02.559922 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:48:02.560028 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:02.559989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:48:02.560028 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560008 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560065 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560080 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls podName:9151f183-6814-4a15-b1a7-9bd9ce7b5c59 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:06.560058787 +0000 UTC m=+164.898044620 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls") pod "dns-default-k779l" (UID: "9151f183-6814-4a15-b1a7-9bd9ce7b5c59") : secret "dns-default-metrics-tls" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560092 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560107 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66bff9f979-6th2g: secret "image-registry-tls" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560117 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert podName:1ddee21f-46d4-45d6-bdfe-9fcc6baf236b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:06.56010099 +0000 UTC m=+164.898086809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert") pod "ingress-canary-9mmmk" (UID: "1ddee21f-46d4-45d6-bdfe-9fcc6baf236b") : secret "canary-serving-cert" not found Apr 22 18:48:02.560679 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:02.560148 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls podName:a464150c-2bac-4805-912c-2e3402d480c8 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:06.560133757 +0000 UTC m=+164.898119589 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls") pod "image-registry-66bff9f979-6th2g" (UID: "a464150c-2bac-4805-912c-2e3402d480c8") : secret "image-registry-tls" not found Apr 22 18:48:03.439415 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:03.439386 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h9t7j" Apr 22 18:48:29.143724 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:29.143696 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q65sk_86c95b36-8aa3-4c99-b0b6-3746cf836c8c/dns-node-resolver/0.log" Apr 22 18:48:29.943620 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:29.943594 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x6px5_53d298d6-4725-419c-b9f4-0f58a63b1715/node-ca/0.log" Apr 22 18:48:31.978692 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:48:31.978659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:48:31.979083 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:31.978768 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:48:31.979083 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:48:31.978835 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs podName:e28dd910-549e-488c-8e99-3ad3f1d11a5e nodeName:}" failed. No retries permitted until 2026-04-22 18:50:33.978815804 +0000 UTC m=+252.316801624 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs") pod "network-metrics-daemon-8xjpc" (UID: "e28dd910-549e-488c-8e99-3ad3f1d11a5e") : secret "metrics-daemon-secret" not found Apr 22 18:49:01.667826 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:49:01.667787 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" podUID="a464150c-2bac-4805-912c-2e3402d480c8" Apr 22 18:49:01.678027 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:49:01.678008 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-k779l" podUID="9151f183-6814-4a15-b1a7-9bd9ce7b5c59" Apr 22 18:49:01.705339 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:49:01.705301 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9mmmk" podUID="1ddee21f-46d4-45d6-bdfe-9fcc6baf236b" Apr 22 18:49:02.070639 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.070604 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5mzdz"] Apr 22 18:49:02.073771 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.073752 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.076167 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.076146 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:49:02.076632 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.076615 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:02.076632 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.076627 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:49:02.076774 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.076618 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:02.076774 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.076679 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6dkrn\"" Apr 22 18:49:02.088268 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.088243 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5mzdz"] Apr 22 18:49:02.189386 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.189343 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5453deb9-8135-4a14-8b07-385e16aad1aa-crio-socket\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.189573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.189409 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" 
(UniqueName: \"kubernetes.io/configmap/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.189573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.189447 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5453deb9-8135-4a14-8b07-385e16aad1aa-data-volume\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.189573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.189472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmbf\" (UniqueName: \"kubernetes.io/projected/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-api-access-wbmbf\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.189573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.189504 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5453deb9-8135-4a14-8b07-385e16aad1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.290626 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " 
pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.290823 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5453deb9-8135-4a14-8b07-385e16aad1aa-data-volume\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.290823 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmbf\" (UniqueName: \"kubernetes.io/projected/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-api-access-wbmbf\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.290823 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5453deb9-8135-4a14-8b07-385e16aad1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.290823 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290792 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5453deb9-8135-4a14-8b07-385e16aad1aa-crio-socket\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.291020 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.290882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/5453deb9-8135-4a14-8b07-385e16aad1aa-crio-socket\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.291098 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.291080 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5453deb9-8135-4a14-8b07-385e16aad1aa-data-volume\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.291225 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.291208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.293039 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.293020 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5453deb9-8135-4a14-8b07-385e16aad1aa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.298559 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.298538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmbf\" (UniqueName: \"kubernetes.io/projected/5453deb9-8135-4a14-8b07-385e16aad1aa-kube-api-access-wbmbf\") pod \"insights-runtime-extractor-5mzdz\" (UID: \"5453deb9-8135-4a14-8b07-385e16aad1aa\") " pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.382290 ip-10-0-138-84 kubenswrapper[2572]: 
I0422 18:49:02.382221 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5mzdz" Apr 22 18:49:02.495499 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.495466 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5mzdz"] Apr 22 18:49:02.498481 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:49:02.498448 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5453deb9_8135_4a14_8b07_385e16aad1aa.slice/crio-6a229ebee049d9184a1d03b9d2844ac5231854c122a2fd4bcd0952842745955b WatchSource:0}: Error finding container 6a229ebee049d9184a1d03b9d2844ac5231854c122a2fd4bcd0952842745955b: Status 404 returned error can't find the container with id 6a229ebee049d9184a1d03b9d2844ac5231854c122a2fd4bcd0952842745955b Apr 22 18:49:02.636396 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.636307 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5mzdz" event={"ID":"5453deb9-8135-4a14-8b07-385e16aad1aa","Type":"ContainerStarted","Data":"a5d7d82352e74b60149a5ed07faf7b403c7502f8e902be4b6fbd95cff70ba143"} Apr 22 18:49:02.636396 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.636349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5mzdz" event={"ID":"5453deb9-8135-4a14-8b07-385e16aad1aa","Type":"ContainerStarted","Data":"6a229ebee049d9184a1d03b9d2844ac5231854c122a2fd4bcd0952842745955b"} Apr 22 18:49:02.636396 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.636367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k779l" Apr 22 18:49:02.636599 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:02.636373 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:03.242417 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:49:03.242327 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8xjpc" podUID="e28dd910-549e-488c-8e99-3ad3f1d11a5e" Apr 22 18:49:03.640231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:03.640196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5mzdz" event={"ID":"5453deb9-8135-4a14-8b07-385e16aad1aa","Type":"ContainerStarted","Data":"67069fdda95f3463819cb7f38b6a765024417db4e51d62532d4057324a2c9f00"} Apr 22 18:49:05.367443 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.367384 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" podUID="86f416fa-d29f-410c-b2a3-3d88449ed7f2" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused" Apr 22 18:49:05.646219 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.646134 2572 generic.go:358] "Generic (PLEG): container finished" podID="86f416fa-d29f-410c-b2a3-3d88449ed7f2" containerID="66c2717ded4ed54ceb63e68470acda6bfcea85bda07f31029212fcd87379645c" exitCode=1 Apr 22 18:49:05.646219 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.646197 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" event={"ID":"86f416fa-d29f-410c-b2a3-3d88449ed7f2","Type":"ContainerDied","Data":"66c2717ded4ed54ceb63e68470acda6bfcea85bda07f31029212fcd87379645c"} Apr 22 18:49:05.646542 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.646519 2572 scope.go:117] "RemoveContainer" 
containerID="66c2717ded4ed54ceb63e68470acda6bfcea85bda07f31029212fcd87379645c" Apr 22 18:49:05.647575 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.647554 2572 generic.go:358] "Generic (PLEG): container finished" podID="f6f853f9-1ef8-49e6-870b-b177504bcdc3" containerID="87b9a000238a93bc06001a0881b65e05eca23f5173751c71d790e4af4ece73d4" exitCode=255 Apr 22 18:49:05.647673 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.647619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" event={"ID":"f6f853f9-1ef8-49e6-870b-b177504bcdc3","Type":"ContainerDied","Data":"87b9a000238a93bc06001a0881b65e05eca23f5173751c71d790e4af4ece73d4"} Apr 22 18:49:05.647906 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.647879 2572 scope.go:117] "RemoveContainer" containerID="87b9a000238a93bc06001a0881b65e05eca23f5173751c71d790e4af4ece73d4" Apr 22 18:49:05.649820 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.649802 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5mzdz" event={"ID":"5453deb9-8135-4a14-8b07-385e16aad1aa","Type":"ContainerStarted","Data":"068521c4cd34536bef396bdd838959eed33ef244f4bd9683b3d13d1ba890312d"} Apr 22 18:49:05.687763 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:05.687709 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5mzdz" podStartSLOduration=1.608228346 podStartE2EDuration="3.687695425s" podCreationTimestamp="2026-04-22 18:49:02 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.553974047 +0000 UTC m=+160.891959867" lastFinishedPulling="2026-04-22 18:49:04.633441113 +0000 UTC m=+162.971426946" observedRunningTime="2026-04-22 18:49:05.686966384 +0000 UTC m=+164.024952226" watchObservedRunningTime="2026-04-22 18:49:05.687695425 +0000 UTC m=+164.025681266" Apr 22 18:49:06.628147 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:49:06.628091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:06.628147 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.628148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:49:06.628576 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.628175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:49:06.630400 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.630365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9151f183-6814-4a15-b1a7-9bd9ce7b5c59-metrics-tls\") pod \"dns-default-k779l\" (UID: \"9151f183-6814-4a15-b1a7-9bd9ce7b5c59\") " pod="openshift-dns/dns-default-k779l" Apr 22 18:49:06.630500 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.630443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ddee21f-46d4-45d6-bdfe-9fcc6baf236b-cert\") pod \"ingress-canary-9mmmk\" (UID: \"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b\") " pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:49:06.630547 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.630511 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"image-registry-66bff9f979-6th2g\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:06.653671 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.653633 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" event={"ID":"86f416fa-d29f-410c-b2a3-3d88449ed7f2","Type":"ContainerStarted","Data":"cfb96245a7e7d2b4209311c6d9b932030c002af21eeed471ed02fed2c2def80e"} Apr 22 18:49:06.653965 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.653944 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:49:06.654570 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.654551 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-868778d8f4-z9vkb" Apr 22 18:49:06.655307 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.655288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7d9885797c-zmmtx" event={"ID":"f6f853f9-1ef8-49e6-870b-b177504bcdc3","Type":"ContainerStarted","Data":"65bf192ba92a1a60214277eb947c2d6e282836e00a2fe5d7a8e911a33db2f69c"} Apr 22 18:49:06.839781 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.839751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:49:06.839988 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.839792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8zvm4\"" Apr 22 18:49:06.848270 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:49:06.848241 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k779l" Apr 22 18:49:06.848395 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.848289 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:06.972232 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.972200 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k779l"] Apr 22 18:49:06.975046 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:49:06.975016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9151f183_6814_4a15_b1a7_9bd9ce7b5c59.slice/crio-8653f2a84621bf3f4709fa277ad4bd3c76277692024b4153e20f144cf0cc00a4 WatchSource:0}: Error finding container 8653f2a84621bf3f4709fa277ad4bd3c76277692024b4153e20f144cf0cc00a4: Status 404 returned error can't find the container with id 8653f2a84621bf3f4709fa277ad4bd3c76277692024b4153e20f144cf0cc00a4 Apr 22 18:49:06.991743 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:06.991720 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:49:06.994855 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:49:06.994828 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda464150c_2bac_4805_912c_2e3402d480c8.slice/crio-d121f9da75b15adbbd8294b5723c9727354b32383e238fb498abae783d43a86a WatchSource:0}: Error finding container d121f9da75b15adbbd8294b5723c9727354b32383e238fb498abae783d43a86a: Status 404 returned error can't find the container with id d121f9da75b15adbbd8294b5723c9727354b32383e238fb498abae783d43a86a Apr 22 18:49:07.659109 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:07.659061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-dns/dns-default-k779l" event={"ID":"9151f183-6814-4a15-b1a7-9bd9ce7b5c59","Type":"ContainerStarted","Data":"8653f2a84621bf3f4709fa277ad4bd3c76277692024b4153e20f144cf0cc00a4"} Apr 22 18:49:07.660565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:07.660524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" event={"ID":"a464150c-2bac-4805-912c-2e3402d480c8","Type":"ContainerStarted","Data":"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8"} Apr 22 18:49:07.660565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:07.660561 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" event={"ID":"a464150c-2bac-4805-912c-2e3402d480c8","Type":"ContainerStarted","Data":"d121f9da75b15adbbd8294b5723c9727354b32383e238fb498abae783d43a86a"} Apr 22 18:49:07.660828 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:07.660792 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:07.678338 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:07.678266 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" podStartSLOduration=165.678252356 podStartE2EDuration="2m45.678252356s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:49:07.677401342 +0000 UTC m=+166.015387184" watchObservedRunningTime="2026-04-22 18:49:07.678252356 +0000 UTC m=+166.016238199" Apr 22 18:49:08.666222 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.666189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k779l" 
event={"ID":"9151f183-6814-4a15-b1a7-9bd9ce7b5c59","Type":"ContainerStarted","Data":"39e0c1f2ed0179629bbf9ed2bb4e5f013301f396c59eb156c57049ba0dd02fac"} Apr 22 18:49:08.969962 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.969929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gzln9"] Apr 22 18:49:08.972849 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.972833 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:08.974829 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.974805 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:08.974829 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.974823 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:08.975007 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.974965 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:49:08.975155 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.975142 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:08.975512 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.975489 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:49:08.975639 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.975620 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:08.975856 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:08.975842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nltc8\"" Apr 22 18:49:09.047917 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.047874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-metrics-client-ca\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048068 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.047924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-root\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048068 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.047954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-textfile\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048068 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-tls\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048068 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-wtmp\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048068 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75mjb\" (UniqueName: \"kubernetes.io/projected/8e048393-e835-467d-8f0d-48a2d41d7bcd-kube-api-access-75mjb\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048227 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048227 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048139 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.048227 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.048220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-sys\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.148885 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.148853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-sys\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149056 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.148931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-metrics-client-ca\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149056 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.148964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-root\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149056 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.148972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-sys\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149056 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.148990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-textfile\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149056 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:49:09.149045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-root\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-tls\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149152 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-wtmp\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75mjb\" (UniqueName: \"kubernetes.io/projected/8e048393-e835-467d-8f0d-48a2d41d7bcd-kube-api-access-75mjb\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149335 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-wtmp\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149581 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-textfile\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149631 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149616 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-metrics-client-ca\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.149779 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.149761 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-gzln9\" (UID: 
\"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.151425 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.151404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.151524 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.151499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8e048393-e835-467d-8f0d-48a2d41d7bcd-node-exporter-tls\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.158461 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.158430 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75mjb\" (UniqueName: \"kubernetes.io/projected/8e048393-e835-467d-8f0d-48a2d41d7bcd-kube-api-access-75mjb\") pod \"node-exporter-gzln9\" (UID: \"8e048393-e835-467d-8f0d-48a2d41d7bcd\") " pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.281836 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.281811 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gzln9" Apr 22 18:49:09.289915 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:49:09.289877 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e048393_e835_467d_8f0d_48a2d41d7bcd.slice/crio-d88faf338c05da15732cfb5a14f5ceaf6363d4c92a9c0f4737a990bbbcf01e22 WatchSource:0}: Error finding container d88faf338c05da15732cfb5a14f5ceaf6363d4c92a9c0f4737a990bbbcf01e22: Status 404 returned error can't find the container with id d88faf338c05da15732cfb5a14f5ceaf6363d4c92a9c0f4737a990bbbcf01e22 Apr 22 18:49:09.671197 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.671110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k779l" event={"ID":"9151f183-6814-4a15-b1a7-9bd9ce7b5c59","Type":"ContainerStarted","Data":"6b42632bb7eace4e3fb7ac8be7ddfdfca49852a927268f4b404b61e3d491cec6"} Apr 22 18:49:09.671591 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.671240 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-k779l" Apr 22 18:49:09.672445 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.672409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gzln9" event={"ID":"8e048393-e835-467d-8f0d-48a2d41d7bcd","Type":"ContainerStarted","Data":"d88faf338c05da15732cfb5a14f5ceaf6363d4c92a9c0f4737a990bbbcf01e22"} Apr 22 18:49:09.689129 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:09.689078 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k779l" podStartSLOduration=130.198716506 podStartE2EDuration="2m11.689062654s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="2026-04-22 18:49:06.976967406 +0000 UTC m=+165.314953225" lastFinishedPulling="2026-04-22 18:49:08.46731354 +0000 UTC m=+166.805299373" observedRunningTime="2026-04-22 
18:49:09.688501353 +0000 UTC m=+168.026487195" watchObservedRunningTime="2026-04-22 18:49:09.689062654 +0000 UTC m=+168.027048500" Apr 22 18:49:10.676067 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:10.676034 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e048393-e835-467d-8f0d-48a2d41d7bcd" containerID="acf62f5dac480ea1b5e0aca8a8682de650cadd44eed91e9249d2e8eed2a93882" exitCode=0 Apr 22 18:49:10.676447 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:10.676120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gzln9" event={"ID":"8e048393-e835-467d-8f0d-48a2d41d7bcd","Type":"ContainerDied","Data":"acf62f5dac480ea1b5e0aca8a8682de650cadd44eed91e9249d2e8eed2a93882"} Apr 22 18:49:11.680556 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:11.680520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gzln9" event={"ID":"8e048393-e835-467d-8f0d-48a2d41d7bcd","Type":"ContainerStarted","Data":"89e9cea7f5daba06193caefc90451d2648bea2da9d0b47dcc09365b1ac364b96"} Apr 22 18:49:11.680556 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:11.680554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gzln9" event={"ID":"8e048393-e835-467d-8f0d-48a2d41d7bcd","Type":"ContainerStarted","Data":"d93ff597d5023db893e7efacb59ede77067313dc1016aafd8e65bdc621f0e998"} Apr 22 18:49:11.698100 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:11.698048 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gzln9" podStartSLOduration=3.026025882 podStartE2EDuration="3.698035881s" podCreationTimestamp="2026-04-22 18:49:08 +0000 UTC" firstStartedPulling="2026-04-22 18:49:09.291517123 +0000 UTC m=+167.629502942" lastFinishedPulling="2026-04-22 18:49:09.963527119 +0000 UTC m=+168.301512941" observedRunningTime="2026-04-22 18:49:11.697533967 +0000 UTC m=+170.035519810" 
watchObservedRunningTime="2026-04-22 18:49:11.698035881 +0000 UTC m=+170.036021722" Apr 22 18:49:13.200730 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:13.200682 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:49:13.202818 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:13.202799 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:49:13.211295 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:13.211276 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9mmmk" Apr 22 18:49:13.323701 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:13.323674 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9mmmk"] Apr 22 18:49:13.327456 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:49:13.327430 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddee21f_46d4_45d6_bdfe_9fcc6baf236b.slice/crio-246bb0157b45dd94309c3d0bb2da84eabe3536439d67f36a1ac08b5374fe9cf3 WatchSource:0}: Error finding container 246bb0157b45dd94309c3d0bb2da84eabe3536439d67f36a1ac08b5374fe9cf3: Status 404 returned error can't find the container with id 246bb0157b45dd94309c3d0bb2da84eabe3536439d67f36a1ac08b5374fe9cf3 Apr 22 18:49:13.688423 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:13.688386 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9mmmk" event={"ID":"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b","Type":"ContainerStarted","Data":"246bb0157b45dd94309c3d0bb2da84eabe3536439d67f36a1ac08b5374fe9cf3"} Apr 22 18:49:15.694784 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:15.694745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9mmmk" 
event={"ID":"1ddee21f-46d4-45d6-bdfe-9fcc6baf236b","Type":"ContainerStarted","Data":"5a701c9254d87ea17f36ebaaba5793ff5b0836008bf5c5d7ac339f9a5bacd2ec"} Apr 22 18:49:15.708194 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:15.708146 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9mmmk" podStartSLOduration=136.105433079 podStartE2EDuration="2m17.708130277s" podCreationTimestamp="2026-04-22 18:46:58 +0000 UTC" firstStartedPulling="2026-04-22 18:49:13.329339302 +0000 UTC m=+171.667325121" lastFinishedPulling="2026-04-22 18:49:14.932036497 +0000 UTC m=+173.270022319" observedRunningTime="2026-04-22 18:49:15.707260839 +0000 UTC m=+174.045246681" watchObservedRunningTime="2026-04-22 18:49:15.708130277 +0000 UTC m=+174.046116168" Apr 22 18:49:17.200554 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:17.200516 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:49:19.678506 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:19.678471 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k779l" Apr 22 18:49:24.434585 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:24.434552 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:49:24.438594 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:24.438567 2572 patch_prober.go:28] interesting pod/image-registry-66bff9f979-6th2g container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:24.438721 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:24.438612 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-66bff9f979-6th2g" podUID="a464150c-2bac-4805-912c-2e3402d480c8" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:34.439374 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:34.439345 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:48.938306 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:48.938263 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" podUID="10f5468e-4567-428e-b0d6-41836a15fb80" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:49:49.452862 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.452812 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" podUID="a464150c-2bac-4805-912c-2e3402d480c8" containerName="registry" containerID="cri-o://01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8" gracePeriod=30 Apr 22 18:49:49.699852 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.699828 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:49.781252 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.781222 2572 generic.go:358] "Generic (PLEG): container finished" podID="a464150c-2bac-4805-912c-2e3402d480c8" containerID="01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8" exitCode=0 Apr 22 18:49:49.781403 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.781262 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" event={"ID":"a464150c-2bac-4805-912c-2e3402d480c8","Type":"ContainerDied","Data":"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8"} Apr 22 18:49:49.781403 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.781279 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" Apr 22 18:49:49.781403 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.781293 2572 scope.go:117] "RemoveContainer" containerID="01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8" Apr 22 18:49:49.781403 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.781283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66bff9f979-6th2g" event={"ID":"a464150c-2bac-4805-912c-2e3402d480c8","Type":"ContainerDied","Data":"d121f9da75b15adbbd8294b5723c9727354b32383e238fb498abae783d43a86a"} Apr 22 18:49:49.788663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.788645 2572 scope.go:117] "RemoveContainer" containerID="01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8" Apr 22 18:49:49.788940 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:49:49.788917 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8\": container with ID starting with 
01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8 not found: ID does not exist" containerID="01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8" Apr 22 18:49:49.789047 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.788944 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8"} err="failed to get container status \"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8\": rpc error: code = NotFound desc = could not find container \"01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8\": container with ID starting with 01bc3f4d83aea4d61afcd29908b0669226bb795cd02c0d4db11b61d8b21f59f8 not found: ID does not exist" Apr 22 18:49:49.853254 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853221 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853275 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853297 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zwvh\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:49:49.853315 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853339 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853413 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853383 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.853930 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.853885 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration\") pod \"a464150c-2bac-4805-912c-2e3402d480c8\" (UID: \"a464150c-2bac-4805-912c-2e3402d480c8\") " Apr 22 18:49:49.856318 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.856287 2572 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:49.856717 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.856691 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:49.857037 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.856975 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh" (OuterVolumeSpecName: "kube-api-access-7zwvh") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "kube-api-access-7zwvh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.859755 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.857153 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zwvh\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-kube-api-access-7zwvh\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.859755 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.857173 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-registry-certificates\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.859755 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.857189 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a464150c-2bac-4805-912c-2e3402d480c8-trusted-ca\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.859978 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.859848 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.859978 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.859856 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:49.859978 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.859869 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:49.860321 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.860303 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:49.866771 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.866749 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a464150c-2bac-4805-912c-2e3402d480c8" (UID: "a464150c-2bac-4805-912c-2e3402d480c8"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:49.957527 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.957496 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-registry-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.957527 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.957524 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a464150c-2bac-4805-912c-2e3402d480c8-bound-sa-token\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.957527 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.957535 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a464150c-2bac-4805-912c-2e3402d480c8-ca-trust-extracted\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.957939 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.957547 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-image-registry-private-configuration\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:49.957939 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:49.957557 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a464150c-2bac-4805-912c-2e3402d480c8-installation-pull-secrets\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:49:50.101736 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:50.101711 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:49:50.105331 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:50.105309 2572 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-image-registry/image-registry-66bff9f979-6th2g"] Apr 22 18:49:50.204504 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:50.204466 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a464150c-2bac-4805-912c-2e3402d480c8" path="/var/lib/kubelet/pods/a464150c-2bac-4805-912c-2e3402d480c8/volumes" Apr 22 18:49:58.937502 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:49:58.937459 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" podUID="10f5468e-4567-428e-b0d6-41836a15fb80" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:50:08.937828 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:08.937787 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" podUID="10f5468e-4567-428e-b0d6-41836a15fb80" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:50:08.938268 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:08.937856 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" Apr 22 18:50:08.938379 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:08.938360 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"89cdb3e096d64f93d54f9f073edd7cbdc9b4302b88e0741905ed1b3c2856115a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:50:08.938417 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:08.938400 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" podUID="10f5468e-4567-428e-b0d6-41836a15fb80" containerName="service-proxy" containerID="cri-o://89cdb3e096d64f93d54f9f073edd7cbdc9b4302b88e0741905ed1b3c2856115a" gracePeriod=30 Apr 22 18:50:09.838620 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:09.838583 2572 generic.go:358] "Generic (PLEG): container finished" podID="10f5468e-4567-428e-b0d6-41836a15fb80" containerID="89cdb3e096d64f93d54f9f073edd7cbdc9b4302b88e0741905ed1b3c2856115a" exitCode=2 Apr 22 18:50:09.838790 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:09.838651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerDied","Data":"89cdb3e096d64f93d54f9f073edd7cbdc9b4302b88e0741905ed1b3c2856115a"} Apr 22 18:50:09.838790 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:09.838686 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-8694f9ff5b-rnrkw" event={"ID":"10f5468e-4567-428e-b0d6-41836a15fb80","Type":"ContainerStarted","Data":"f30f988deb73fcdf44515f5d24eca3bc9def466c9ff6d20144ae48de7de9a2d0"} Apr 22 18:50:34.077314 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.077220 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: \"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:50:34.079520 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.079499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e28dd910-549e-488c-8e99-3ad3f1d11a5e-metrics-certs\") pod \"network-metrics-daemon-8xjpc\" (UID: 
\"e28dd910-549e-488c-8e99-3ad3f1d11a5e\") " pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:50:34.303523 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.303495 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\"" Apr 22 18:50:34.312161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.312139 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8xjpc" Apr 22 18:50:34.425914 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.425865 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8xjpc"] Apr 22 18:50:34.428456 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:50:34.428427 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode28dd910_549e_488c_8e99_3ad3f1d11a5e.slice/crio-336f7033f614dcddb8dc983825b125dc0f547e80d31d2770358384b81c5d7292 WatchSource:0}: Error finding container 336f7033f614dcddb8dc983825b125dc0f547e80d31d2770358384b81c5d7292: Status 404 returned error can't find the container with id 336f7033f614dcddb8dc983825b125dc0f547e80d31d2770358384b81c5d7292 Apr 22 18:50:34.908292 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:34.908252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xjpc" event={"ID":"e28dd910-549e-488c-8e99-3ad3f1d11a5e","Type":"ContainerStarted","Data":"336f7033f614dcddb8dc983825b125dc0f547e80d31d2770358384b81c5d7292"} Apr 22 18:50:35.912280 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:35.912246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xjpc" event={"ID":"e28dd910-549e-488c-8e99-3ad3f1d11a5e","Type":"ContainerStarted","Data":"a24c5e7c9d40ab2b52dee45ec1d0807cc95f3acc5c6ad1e88d2f32ffb2cf4712"} Apr 22 18:50:35.912280 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:50:35.912284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8xjpc" event={"ID":"e28dd910-549e-488c-8e99-3ad3f1d11a5e","Type":"ContainerStarted","Data":"f3d89d76187739d84b32e71012322967109b9ccd1a27e7e336c849f4bf9fc5af"} Apr 22 18:50:35.927639 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:50:35.927593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8xjpc" podStartSLOduration=253.022470969 podStartE2EDuration="4m13.927579238s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:50:34.430282527 +0000 UTC m=+252.768268346" lastFinishedPulling="2026-04-22 18:50:35.335390793 +0000 UTC m=+253.673376615" observedRunningTime="2026-04-22 18:50:35.926125522 +0000 UTC m=+254.264111363" watchObservedRunningTime="2026-04-22 18:50:35.927579238 +0000 UTC m=+254.265565078" Apr 22 18:51:22.126191 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:51:22.126163 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:52:14.662790 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.662753 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw"] Apr 22 18:52:14.663239 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.662993 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a464150c-2bac-4805-912c-2e3402d480c8" containerName="registry" Apr 22 18:52:14.663239 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.663003 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a464150c-2bac-4805-912c-2e3402d480c8" containerName="registry" Apr 22 18:52:14.663239 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.663043 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a464150c-2bac-4805-912c-2e3402d480c8" containerName="registry" Apr 22 18:52:14.665719 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:52:14.665703 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.667807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.667775 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 18:52:14.667807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.667789 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 18:52:14.667807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.667805 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lwb7z\"" Apr 22 18:52:14.668143 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.668106 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 18:52:14.675806 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.675782 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw"] Apr 22 18:52:14.748465 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.748435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1789b6-b667-41c5-9eeb-be62f522283b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: \"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.748642 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.748491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-755tg\" (UniqueName: 
\"kubernetes.io/projected/0b1789b6-b667-41c5-9eeb-be62f522283b-kube-api-access-755tg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: \"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.849783 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.849745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-755tg\" (UniqueName: \"kubernetes.io/projected/0b1789b6-b667-41c5-9eeb-be62f522283b-kube-api-access-755tg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: \"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.849974 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.849815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1789b6-b667-41c5-9eeb-be62f522283b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: \"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.852168 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.852149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/0b1789b6-b667-41c5-9eeb-be62f522283b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: \"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.857305 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.857283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-755tg\" (UniqueName: \"kubernetes.io/projected/0b1789b6-b667-41c5-9eeb-be62f522283b-kube-api-access-755tg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw\" (UID: 
\"0b1789b6-b667-41c5-9eeb-be62f522283b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:14.976013 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:14.975937 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:15.090575 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:15.090544 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw"] Apr 22 18:52:15.094050 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:52:15.094021 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1789b6_b667_41c5_9eeb_be62f522283b.slice/crio-5864a947e00c012923f44c896e696a36431d34678e7995289f97b18b24ed029b WatchSource:0}: Error finding container 5864a947e00c012923f44c896e696a36431d34678e7995289f97b18b24ed029b: Status 404 returned error can't find the container with id 5864a947e00c012923f44c896e696a36431d34678e7995289f97b18b24ed029b Apr 22 18:52:15.095756 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:15.095738 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:52:15.163573 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:15.163546 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" event={"ID":"0b1789b6-b667-41c5-9eeb-be62f522283b","Type":"ContainerStarted","Data":"5864a947e00c012923f44c896e696a36431d34678e7995289f97b18b24ed029b"} Apr 22 18:52:19.024500 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.024465 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w9jss"] Apr 22 18:52:19.027619 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.027600 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.029637 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.029617 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 22 18:52:19.029766 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.029747 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 22 18:52:19.029811 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.029803 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-lwngs\"" Apr 22 18:52:19.035615 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.035594 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w9jss"] Apr 22 18:52:19.080455 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.080430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.080588 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.080460 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrfq\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-kube-api-access-fcrfq\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.080588 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.080498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/25b75a27-81b8-45cc-be2b-4941e3db0817-cabundle0\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.176505 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.176457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" event={"ID":"0b1789b6-b667-41c5-9eeb-be62f522283b","Type":"ContainerStarted","Data":"9b5cc18c9fcaeb9ddad0e8c6ba6cb1adb9985a6663778f6de4a0c5d655fef277"} Apr 22 18:52:19.176682 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.176626 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:19.181360 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.181337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/25b75a27-81b8-45cc-be2b-4941e3db0817-cabundle0\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.181481 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.181388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.181531 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.181476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcrfq\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-kube-api-access-fcrfq\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " 
pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.181531 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.181481 2572 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 22 18:52:19.181531 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.181518 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:19.181531 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.181528 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:19.181666 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.181542 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-w9jss: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 18:52:19.181666 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.181608 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates podName:25b75a27-81b8-45cc-be2b-4941e3db0817 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:19.681589558 +0000 UTC m=+358.019575379 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates") pod "keda-operator-ffbb595cb-w9jss" (UID: "25b75a27-81b8-45cc-be2b-4941e3db0817") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 22 18:52:19.181964 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.181946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/25b75a27-81b8-45cc-be2b-4941e3db0817-cabundle0\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.191158 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.191141 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcrfq\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-kube-api-access-fcrfq\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.192752 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.192684 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" podStartSLOduration=1.816650911 podStartE2EDuration="5.192669309s" podCreationTimestamp="2026-04-22 18:52:14 +0000 UTC" firstStartedPulling="2026-04-22 18:52:15.095948962 +0000 UTC m=+353.433934796" lastFinishedPulling="2026-04-22 18:52:18.471967371 +0000 UTC m=+356.809953194" observedRunningTime="2026-04-22 18:52:19.191807272 +0000 UTC m=+357.529793113" watchObservedRunningTime="2026-04-22 18:52:19.192669309 +0000 UTC m=+357.530655152" Apr 22 18:52:19.685188 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:19.685153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:19.685367 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.685278 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:19.685367 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.685289 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:19.685367 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.685298 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-w9jss: references non-existent secret key: ca.crt Apr 22 18:52:19.685367 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:19.685353 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates podName:25b75a27-81b8-45cc-be2b-4941e3db0817 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:20.685340608 +0000 UTC m=+359.023326426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates") pod "keda-operator-ffbb595cb-w9jss" (UID: "25b75a27-81b8-45cc-be2b-4941e3db0817") : references non-existent secret key: ca.crt Apr 22 18:52:20.692187 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:20.692148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:20.692579 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:20.692280 2572 secret.go:281] references non-existent secret key: ca.crt Apr 22 18:52:20.692579 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:20.692299 2572 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 18:52:20.692579 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:20.692308 2572 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-w9jss: references non-existent secret key: ca.crt Apr 22 18:52:20.692579 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:52:20.692368 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates podName:25b75a27-81b8-45cc-be2b-4941e3db0817 nodeName:}" failed. No retries permitted until 2026-04-22 18:52:22.692354485 +0000 UTC m=+361.030340304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates") pod "keda-operator-ffbb595cb-w9jss" (UID: "25b75a27-81b8-45cc-be2b-4941e3db0817") : references non-existent secret key: ca.crt Apr 22 18:52:22.706293 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:22.706254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:22.708693 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:22.708668 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/25b75a27-81b8-45cc-be2b-4941e3db0817-certificates\") pod \"keda-operator-ffbb595cb-w9jss\" (UID: \"25b75a27-81b8-45cc-be2b-4941e3db0817\") " pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:22.939615 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:22.939585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-lwngs\"" Apr 22 18:52:22.948339 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:22.948302 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:23.062811 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:23.062779 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w9jss"] Apr 22 18:52:23.065688 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:52:23.065662 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b75a27_81b8_45cc_be2b_4941e3db0817.slice/crio-cc7ecc60812feb57f63ce4dab1b3bede05261b1a98c0589a77bd7dd78ba0d1e7 WatchSource:0}: Error finding container cc7ecc60812feb57f63ce4dab1b3bede05261b1a98c0589a77bd7dd78ba0d1e7: Status 404 returned error can't find the container with id cc7ecc60812feb57f63ce4dab1b3bede05261b1a98c0589a77bd7dd78ba0d1e7 Apr 22 18:52:23.188353 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:23.188315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" event={"ID":"25b75a27-81b8-45cc-be2b-4941e3db0817","Type":"ContainerStarted","Data":"cc7ecc60812feb57f63ce4dab1b3bede05261b1a98c0589a77bd7dd78ba0d1e7"} Apr 22 18:52:26.198274 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:26.198187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" event={"ID":"25b75a27-81b8-45cc-be2b-4941e3db0817","Type":"ContainerStarted","Data":"065da9b51c05b04dd88c860857059101b3628135d4cec887ba8afbc87eb89a84"} Apr 22 18:52:26.198677 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:26.198307 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:52:26.214918 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:26.214856 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" podStartSLOduration=4.460869315 podStartE2EDuration="7.214843345s" 
podCreationTimestamp="2026-04-22 18:52:19 +0000 UTC" firstStartedPulling="2026-04-22 18:52:23.066883993 +0000 UTC m=+361.404869812" lastFinishedPulling="2026-04-22 18:52:25.820858018 +0000 UTC m=+364.158843842" observedRunningTime="2026-04-22 18:52:26.213455993 +0000 UTC m=+364.551441834" watchObservedRunningTime="2026-04-22 18:52:26.214843345 +0000 UTC m=+364.552829185" Apr 22 18:52:40.182030 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:40.181997 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-vxtjw" Apr 22 18:52:47.203811 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:52:47.203782 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-w9jss" Apr 22 18:53:25.263240 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.263205 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:53:25.266318 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.266299 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.269123 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.269100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-jm5vc\"" Apr 22 18:53:25.269243 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.269100 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:53:25.269243 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.269108 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:53:25.269243 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.269177 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:53:25.276336 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.276314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:53:25.437161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.437127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5z96\" (UniqueName: \"kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.437161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.437159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 
18:53:25.538057 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.538023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5z96\" (UniqueName: \"kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.538057 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.538055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.540495 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.540470 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.546997 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.546974 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5z96\" (UniqueName: \"kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96\") pod \"kserve-controller-manager-6f655776dd-q2wqq\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.577580 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.577553 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:25.693744 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:25.693712 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:53:25.697517 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:53:25.697488 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2bba80_1bbd_46c4_b857_46543907dc1f.slice/crio-db788cbf490918caeaf2eb7b0b4c31ad33c68613a2695f41a5a81d4c3b5a137b WatchSource:0}: Error finding container db788cbf490918caeaf2eb7b0b4c31ad33c68613a2695f41a5a81d4c3b5a137b: Status 404 returned error can't find the container with id db788cbf490918caeaf2eb7b0b4c31ad33c68613a2695f41a5a81d4c3b5a137b Apr 22 18:53:26.344915 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:26.344867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" event={"ID":"0f2bba80-1bbd-46c4-b857-46543907dc1f","Type":"ContainerStarted","Data":"db788cbf490918caeaf2eb7b0b4c31ad33c68613a2695f41a5a81d4c3b5a137b"} Apr 22 18:53:29.354871 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:29.354837 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" event={"ID":"0f2bba80-1bbd-46c4-b857-46543907dc1f","Type":"ContainerStarted","Data":"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492"} Apr 22 18:53:29.355297 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:29.354938 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:53:29.369873 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:53:29.369829 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" podStartSLOduration=1.380721018 
podStartE2EDuration="4.36981774s" podCreationTimestamp="2026-04-22 18:53:25 +0000 UTC" firstStartedPulling="2026-04-22 18:53:25.698874812 +0000 UTC m=+424.036860631" lastFinishedPulling="2026-04-22 18:53:28.687971531 +0000 UTC m=+427.025957353" observedRunningTime="2026-04-22 18:53:29.368496525 +0000 UTC m=+427.706482366" watchObservedRunningTime="2026-04-22 18:53:29.36981774 +0000 UTC m=+427.707803581" Apr 22 18:54:00.362368 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.362290 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:54:00.718048 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.717967 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:54:00.718203 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.718186 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" podUID="0f2bba80-1bbd-46c4-b857-46543907dc1f" containerName="manager" containerID="cri-o://6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492" gracePeriod=10 Apr 22 18:54:00.944376 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.944354 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:54:00.983965 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.983869 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert\") pod \"0f2bba80-1bbd-46c4-b857-46543907dc1f\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " Apr 22 18:54:00.983965 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.983956 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5z96\" (UniqueName: \"kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96\") pod \"0f2bba80-1bbd-46c4-b857-46543907dc1f\" (UID: \"0f2bba80-1bbd-46c4-b857-46543907dc1f\") " Apr 22 18:54:00.985959 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.985935 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96" (OuterVolumeSpecName: "kube-api-access-m5z96") pod "0f2bba80-1bbd-46c4-b857-46543907dc1f" (UID: "0f2bba80-1bbd-46c4-b857-46543907dc1f"). InnerVolumeSpecName "kube-api-access-m5z96". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:00.986098 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:00.986071 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert" (OuterVolumeSpecName: "cert") pod "0f2bba80-1bbd-46c4-b857-46543907dc1f" (UID: "0f2bba80-1bbd-46c4-b857-46543907dc1f"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:01.084659 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.084630 2572 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2bba80-1bbd-46c4-b857-46543907dc1f-cert\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.084659 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.084657 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5z96\" (UniqueName: \"kubernetes.io/projected/0f2bba80-1bbd-46c4-b857-46543907dc1f-kube-api-access-m5z96\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:54:01.442297 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.442262 2572 generic.go:358] "Generic (PLEG): container finished" podID="0f2bba80-1bbd-46c4-b857-46543907dc1f" containerID="6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492" exitCode=0 Apr 22 18:54:01.442721 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.442305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" event={"ID":"0f2bba80-1bbd-46c4-b857-46543907dc1f","Type":"ContainerDied","Data":"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492"} Apr 22 18:54:01.442721 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.442323 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" Apr 22 18:54:01.442721 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.442343 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-q2wqq" event={"ID":"0f2bba80-1bbd-46c4-b857-46543907dc1f","Type":"ContainerDied","Data":"db788cbf490918caeaf2eb7b0b4c31ad33c68613a2695f41a5a81d4c3b5a137b"} Apr 22 18:54:01.442721 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.442361 2572 scope.go:117] "RemoveContainer" containerID="6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492" Apr 22 18:54:01.450074 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.450017 2572 scope.go:117] "RemoveContainer" containerID="6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492" Apr 22 18:54:01.450416 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:54:01.450388 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492\": container with ID starting with 6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492 not found: ID does not exist" containerID="6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492" Apr 22 18:54:01.450508 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.450417 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492"} err="failed to get container status \"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492\": rpc error: code = NotFound desc = could not find container \"6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492\": container with ID starting with 6e6256f58d5532904b943d7ccfe59c69d083023df9997aeda062286787580492 not found: ID does not exist" Apr 22 18:54:01.461370 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.461343 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:54:01.464312 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:01.464291 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-q2wqq"] Apr 22 18:54:02.204377 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:02.204341 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2bba80-1bbd-46c4-b857-46543907dc1f" path="/var/lib/kubelet/pods/0f2bba80-1bbd-46c4-b857-46543907dc1f/volumes" Apr 22 18:54:35.685520 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.685487 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-jthj6"] Apr 22 18:54:35.685952 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.685760 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2bba80-1bbd-46c4-b857-46543907dc1f" containerName="manager" Apr 22 18:54:35.685952 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.685773 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2bba80-1bbd-46c4-b857-46543907dc1f" containerName="manager" Apr 22 18:54:35.685952 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.685834 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f2bba80-1bbd-46c4-b857-46543907dc1f" containerName="manager" Apr 22 18:54:35.688653 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.688636 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.690664 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.690636 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:54:35.690869 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.690855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-442nm\"" Apr 22 18:54:35.690974 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.690853 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:54:35.691417 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.691402 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:54:35.697714 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.697691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-jthj6"] Apr 22 18:54:35.822539 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.822506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnd6b\" (UniqueName: \"kubernetes.io/projected/059b92a9-04fa-4655-885e-b791d19ead5b-kube-api-access-qnd6b\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.822718 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.822595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/059b92a9-04fa-4655-885e-b791d19ead5b-tls-certs\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.923918 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:54:35.923869 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnd6b\" (UniqueName: \"kubernetes.io/projected/059b92a9-04fa-4655-885e-b791d19ead5b-kube-api-access-qnd6b\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.924077 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.923990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/059b92a9-04fa-4655-885e-b791d19ead5b-tls-certs\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.926347 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.926328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/059b92a9-04fa-4655-885e-b791d19ead5b-tls-certs\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.931712 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.931686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnd6b\" (UniqueName: \"kubernetes.io/projected/059b92a9-04fa-4655-885e-b791d19ead5b-kube-api-access-qnd6b\") pod \"model-serving-api-86f7b4b499-jthj6\" (UID: \"059b92a9-04fa-4655-885e-b791d19ead5b\") " pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:35.999306 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:35.999242 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:36.116099 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:36.116077 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-jthj6"] Apr 22 18:54:36.119099 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:54:36.119065 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod059b92a9_04fa_4655_885e_b791d19ead5b.slice/crio-2f5da5600da2ed0b29b43cd0ce71d11db092c496a1127dcb3d23dcb493d38a3b WatchSource:0}: Error finding container 2f5da5600da2ed0b29b43cd0ce71d11db092c496a1127dcb3d23dcb493d38a3b: Status 404 returned error can't find the container with id 2f5da5600da2ed0b29b43cd0ce71d11db092c496a1127dcb3d23dcb493d38a3b Apr 22 18:54:36.532517 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:36.532480 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-jthj6" event={"ID":"059b92a9-04fa-4655-885e-b791d19ead5b","Type":"ContainerStarted","Data":"2f5da5600da2ed0b29b43cd0ce71d11db092c496a1127dcb3d23dcb493d38a3b"} Apr 22 18:54:38.539830 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:38.539794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-jthj6" event={"ID":"059b92a9-04fa-4655-885e-b791d19ead5b","Type":"ContainerStarted","Data":"7ca17cf59078515d9bbcabdc18553db767daa39b3d4c2475641e4c5dbc5690a6"} Apr 22 18:54:38.540189 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:38.539945 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:54:38.554714 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:38.554669 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-jthj6" podStartSLOduration=1.211155585 podStartE2EDuration="3.554655335s" podCreationTimestamp="2026-04-22 
18:54:35 +0000 UTC" firstStartedPulling="2026-04-22 18:54:36.12071655 +0000 UTC m=+494.458702368" lastFinishedPulling="2026-04-22 18:54:38.46421629 +0000 UTC m=+496.802202118" observedRunningTime="2026-04-22 18:54:38.553744652 +0000 UTC m=+496.891730493" watchObservedRunningTime="2026-04-22 18:54:38.554655335 +0000 UTC m=+496.892641192" Apr 22 18:54:49.546558 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:54:49.546530 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-jthj6" Apr 22 18:55:13.212287 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.212255 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"] Apr 22 18:55:13.215217 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.215200 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.217122 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.217103 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:55:13.217358 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.217340 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:55:13.218016 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.218001 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-n6c6z\"" Apr 22 18:55:13.218079 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.218025 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9844b-kube-rbac-proxy-sar-config\"" Apr 22 18:55:13.218116 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.218025 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-9844b-predictor-serving-cert\"" Apr 22 18:55:13.224027 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.224004 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"] Apr 22 18:55:13.393840 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.393806 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.394031 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.393944 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hgzq\" (UniqueName: \"kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.394107 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.394048 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.495206 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.495116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"success-200-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.495206 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.495175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hgzq\" (UniqueName: \"kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.495421 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.495210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.495782 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.495756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.497687 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.497666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.503980 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.503959 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hgzq\" (UniqueName: \"kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq\") pod \"success-200-isvc-9844b-predictor-758ffd9676-tf4ql\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.525422 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.525397 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:13.642161 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:13.642110 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"] Apr 22 18:55:13.647098 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:55:13.647065 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef84b19_d69a_4d3e_8917_b80a93056b4b.slice/crio-721a1705264dc4aeb88569f7e7ff9886ebba2f92d5aa88fa503b74a2b381454f WatchSource:0}: Error finding container 721a1705264dc4aeb88569f7e7ff9886ebba2f92d5aa88fa503b74a2b381454f: Status 404 returned error can't find the container with id 721a1705264dc4aeb88569f7e7ff9886ebba2f92d5aa88fa503b74a2b381454f Apr 22 18:55:14.478154 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.476636 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"] Apr 22 18:55:14.480271 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:55:14.480247 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.485210 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.484559 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 22 18:55:14.485210 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.484810 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 22 18:55:14.490054 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.489979 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"] Apr 22 18:55:14.602957 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.602692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.602957 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.602742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p25z7\" (UniqueName: \"kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.602957 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.602807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.602957 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.602845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.638877 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.638798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerStarted","Data":"721a1705264dc4aeb88569f7e7ff9886ebba2f92d5aa88fa503b74a2b381454f"} Apr 22 18:55:14.703528 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.703488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.703724 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.703551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.703724 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.703596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.703724 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.703622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p25z7\" (UniqueName: \"kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.704185 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:55:14.704160 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-serving-cert: secret "isvc-sklearn-graph-2-predictor-serving-cert" not found Apr 22 18:55:14.704260 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:55:14.704239 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls podName:c86cd209-eb15-443e-8603-47d62cdea78b nodeName:}" failed. No retries permitted until 2026-04-22 18:55:15.204217966 +0000 UTC m=+533.542203787 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls") pod "isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" (UID: "c86cd209-eb15-443e-8603-47d62cdea78b") : secret "isvc-sklearn-graph-2-predictor-serving-cert" not found Apr 22 18:55:14.704305 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.704261 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.704305 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.704285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:14.713933 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:14.713862 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p25z7\" (UniqueName: \"kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:15.208009 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:15.207930 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:15.220968 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:15.219967 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:15.402187 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:15.402143 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:15.582718 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:15.580909 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"] Apr 22 18:55:15.586350 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:55:15.586043 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86cd209_eb15_443e_8603_47d62cdea78b.slice/crio-f40447fcf833e4877eaafeaa1ad82c0a92de44128a9f5aebe0948cb7c7f36091 WatchSource:0}: Error finding container f40447fcf833e4877eaafeaa1ad82c0a92de44128a9f5aebe0948cb7c7f36091: Status 404 returned error can't find the container with id f40447fcf833e4877eaafeaa1ad82c0a92de44128a9f5aebe0948cb7c7f36091 Apr 22 18:55:15.644215 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:15.644143 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" 
event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerStarted","Data":"f40447fcf833e4877eaafeaa1ad82c0a92de44128a9f5aebe0948cb7c7f36091"} Apr 22 18:55:26.678913 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:26.678867 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerStarted","Data":"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"} Apr 22 18:55:26.680235 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:26.680202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerStarted","Data":"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"} Apr 22 18:55:29.692358 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:29.692321 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerStarted","Data":"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"} Apr 22 18:55:29.692831 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:29.692503 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:29.709621 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:29.709572 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podStartSLOduration=1.662257623 podStartE2EDuration="16.709559979s" podCreationTimestamp="2026-04-22 18:55:13 +0000 UTC" firstStartedPulling="2026-04-22 18:55:13.649266245 +0000 UTC m=+531.987252064" lastFinishedPulling="2026-04-22 18:55:28.696568592 +0000 UTC m=+547.034554420" 
observedRunningTime="2026-04-22 18:55:29.708000296 +0000 UTC m=+548.045986141" watchObservedRunningTime="2026-04-22 18:55:29.709559979 +0000 UTC m=+548.047545850" Apr 22 18:55:30.696989 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:30.696948 2572 generic.go:358] "Generic (PLEG): container finished" podID="c86cd209-eb15-443e-8603-47d62cdea78b" containerID="6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1" exitCode=0 Apr 22 18:55:30.697410 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:30.697021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerDied","Data":"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"} Apr 22 18:55:30.697532 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:30.697512 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:30.698836 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:30.698812 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 18:55:31.700111 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:31.700073 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 18:55:36.706319 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:36.706069 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:55:36.706764 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:36.706610 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 18:55:38.725878 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.725841 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerStarted","Data":"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"} Apr 22 18:55:38.725878 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.725880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerStarted","Data":"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"} Apr 22 18:55:38.726300 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.726173 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:38.726339 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.726300 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:38.727381 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.727356 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: 
connection refused" Apr 22 18:55:38.744389 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:38.744331 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podStartSLOduration=2.281579289 podStartE2EDuration="24.744313361s" podCreationTimestamp="2026-04-22 18:55:14 +0000 UTC" firstStartedPulling="2026-04-22 18:55:15.589662361 +0000 UTC m=+533.927648181" lastFinishedPulling="2026-04-22 18:55:38.052396433 +0000 UTC m=+556.390382253" observedRunningTime="2026-04-22 18:55:38.742618284 +0000 UTC m=+557.080604125" watchObservedRunningTime="2026-04-22 18:55:38.744313361 +0000 UTC m=+557.082299206" Apr 22 18:55:39.728917 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:39.728858 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:55:44.733395 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:44.733364 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:55:44.734093 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:44.734058 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:55:46.707181 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:46.707142 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.17:8080: connect: connection refused" Apr 22 18:55:54.734957 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:54.734883 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:55:56.707456 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:55:56.707420 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 18:56:04.734487 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:04.734440 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:56:06.706871 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:06.706832 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.17:8080: connect: connection refused" Apr 22 18:56:14.734471 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:14.734430 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:56:16.708018 ip-10-0-138-84 kubenswrapper[2572]: I0422 
18:56:16.707990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:56:24.734169 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:24.734122 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:56:34.734295 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:34.734205 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused" Apr 22 18:56:44.735056 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:44.735026 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" Apr 22 18:56:47.544533 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.544499 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"] Apr 22 18:56:47.544996 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.544771 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container" containerID="cri-o://a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047" gracePeriod=30 Apr 22 18:56:47.544996 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.544844 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" 
podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kube-rbac-proxy" containerID="cri-o://1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e" gracePeriod=30 Apr 22 18:56:47.606963 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.606931 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"] Apr 22 18:56:47.610248 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.610229 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.612191 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.612174 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-70f3d-predictor-serving-cert\"" Apr 22 18:56:47.612191 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.612184 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\"" Apr 22 18:56:47.619079 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.619056 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"] Apr 22 18:56:47.705878 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.705847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.706029 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.705885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcks\" (UniqueName: 
\"kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.706029 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.706017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.806704 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.806635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.806704 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.806672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcks\" (UniqueName: \"kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.806858 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.806719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.806858 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:56:47.806797 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-serving-cert: secret "success-200-isvc-70f3d-predictor-serving-cert" not found Apr 22 18:56:47.806989 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:56:47.806882 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls podName:ea3777a4-31a2-460c-b453-d828f465a603 nodeName:}" failed. No retries permitted until 2026-04-22 18:56:48.306859959 +0000 UTC m=+626.644845777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls") pod "success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" (UID: "ea3777a4-31a2-460c-b453-d828f465a603") : secret "success-200-isvc-70f3d-predictor-serving-cert" not found Apr 22 18:56:47.807410 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.807391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.817183 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.817152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcks\" (UniqueName: 
\"kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:47.910864 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.910828 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerID="1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e" exitCode=2 Apr 22 18:56:47.911021 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:47.910893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerDied","Data":"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"} Apr 22 18:56:48.310158 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.310130 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:48.312442 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.312424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") pod \"success-200-isvc-70f3d-predictor-f4754d89d-rtwtg\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:48.520228 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.520193 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:48.839757 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.839368 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"] Apr 22 18:56:48.842266 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:56:48.842232 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea3777a4_31a2_460c_b453_d828f465a603.slice/crio-26102093c5bde4860298a7dd052f461dd65deae87c86ba4b778c6aa84d039ac1 WatchSource:0}: Error finding container 26102093c5bde4860298a7dd052f461dd65deae87c86ba4b778c6aa84d039ac1: Status 404 returned error can't find the container with id 26102093c5bde4860298a7dd052f461dd65deae87c86ba4b778c6aa84d039ac1 Apr 22 18:56:48.918202 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.918172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerStarted","Data":"c341cf310944cd99903df05802ecf7490a590f4cfde7a1443cf58d5feda21f5d"} Apr 22 18:56:48.918300 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:48.918210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerStarted","Data":"26102093c5bde4860298a7dd052f461dd65deae87c86ba4b778c6aa84d039ac1"} Apr 22 18:56:49.922133 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:49.922098 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerStarted","Data":"db86617c6fe3315d72e7af914a01adbc2b88063b85481e020a2c25fadbb80a03"} Apr 22 18:56:49.922663 ip-10-0-138-84 
kubenswrapper[2572]: I0422 18:56:49.922376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:49.922663 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:49.922415 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 18:56:49.923660 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:49.923633 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused" Apr 22 18:56:49.938247 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:49.938194 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podStartSLOduration=2.9381804000000002 podStartE2EDuration="2.9381804s" podCreationTimestamp="2026-04-22 18:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:56:49.936294894 +0000 UTC m=+628.274280735" watchObservedRunningTime="2026-04-22 18:56:49.9381804 +0000 UTC m=+628.276166240" Apr 22 18:56:50.480579 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.480554 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" Apr 22 18:56:50.528551 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.528525 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config\") pod \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " Apr 22 18:56:50.528704 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.528620 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls\") pod \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " Apr 22 18:56:50.528704 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.528652 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hgzq\" (UniqueName: \"kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq\") pod \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\" (UID: \"4ef84b19-d69a-4d3e-8917-b80a93056b4b\") " Apr 22 18:56:50.528861 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.528838 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-9844b-kube-rbac-proxy-sar-config") pod "4ef84b19-d69a-4d3e-8917-b80a93056b4b" (UID: "4ef84b19-d69a-4d3e-8917-b80a93056b4b"). InnerVolumeSpecName "success-200-isvc-9844b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:56:50.530784 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.530748 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ef84b19-d69a-4d3e-8917-b80a93056b4b" (UID: "4ef84b19-d69a-4d3e-8917-b80a93056b4b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:50.530784 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.530757 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq" (OuterVolumeSpecName: "kube-api-access-9hgzq") pod "4ef84b19-d69a-4d3e-8917-b80a93056b4b" (UID: "4ef84b19-d69a-4d3e-8917-b80a93056b4b"). InnerVolumeSpecName "kube-api-access-9hgzq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:50.629697 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.629627 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ef84b19-d69a-4d3e-8917-b80a93056b4b-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:50.629697 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.629656 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9hgzq\" (UniqueName: \"kubernetes.io/projected/4ef84b19-d69a-4d3e-8917-b80a93056b4b-kube-api-access-9hgzq\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:50.629697 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.629667 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-9844b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ef84b19-d69a-4d3e-8917-b80a93056b4b-success-200-isvc-9844b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 18:56:50.926109 
ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926031 2572 generic.go:358] "Generic (PLEG): container finished" podID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerID="a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047" exitCode=0
Apr 22 18:56:50.926109 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926099 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"
Apr 22 18:56:50.926548 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926117 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerDied","Data":"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"}
Apr 22 18:56:50.926548 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql" event={"ID":"4ef84b19-d69a-4d3e-8917-b80a93056b4b","Type":"ContainerDied","Data":"721a1705264dc4aeb88569f7e7ff9886ebba2f92d5aa88fa503b74a2b381454f"}
Apr 22 18:56:50.926548 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926170 2572 scope.go:117] "RemoveContainer" containerID="1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"
Apr 22 18:56:50.926807 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.926774 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 22 18:56:50.934357 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.934340 2572 scope.go:117] "RemoveContainer" containerID="a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"
Apr 22 18:56:50.941437 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.941419 2572 scope.go:117] "RemoveContainer" containerID="1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"
Apr 22 18:56:50.941674 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:56:50.941657 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e\": container with ID starting with 1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e not found: ID does not exist" containerID="1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"
Apr 22 18:56:50.941742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.941687 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e"} err="failed to get container status \"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e\": rpc error: code = NotFound desc = could not find container \"1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e\": container with ID starting with 1848094ad6200b3b7b69be34f11ba5194618704ff58314ea480f1491cf1a343e not found: ID does not exist"
Apr 22 18:56:50.941742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.941713 2572 scope.go:117] "RemoveContainer" containerID="a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"
Apr 22 18:56:50.942013 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:56:50.941995 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047\": container with ID starting with a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047 not found: ID does not exist" containerID="a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"
Apr 22 18:56:50.942063 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.942021 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047"} err="failed to get container status \"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047\": rpc error: code = NotFound desc = could not find container \"a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047\": container with ID starting with a764ca1881471a083c0ffd1c24b9986a770b71ee8c2b14212d0b3639b70db047 not found: ID does not exist"
Apr 22 18:56:50.946148 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.946126 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"]
Apr 22 18:56:50.947868 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:50.947848 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-9844b-predictor-758ffd9676-tf4ql"]
Apr 22 18:56:52.205032 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:52.205000 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" path="/var/lib/kubelet/pods/4ef84b19-d69a-4d3e-8917-b80a93056b4b/volumes"
Apr 22 18:56:55.930858 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:55.930831 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"
Apr 22 18:56:55.931361 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:56:55.931336 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 22 18:57:05.931534 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:05.931492 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 22 18:57:15.931710 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:15.931670 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 22 18:57:23.439190 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.439119 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"]
Apr 22 18:57:23.439665 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.439630 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" containerID="cri-o://18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a" gracePeriod=30
Apr 22 18:57:23.439811 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.439786 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kube-rbac-proxy" containerID="cri-o://f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4" gracePeriod=30
Apr 22 18:57:23.489227 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489200 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"]
Apr 22 18:57:23.489534 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489521 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container"
Apr 22 18:57:23.489579 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489536 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container"
Apr 22 18:57:23.489579 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489559 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kube-rbac-proxy"
Apr 22 18:57:23.489579 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489567 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kube-rbac-proxy"
Apr 22 18:57:23.489667 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489635 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kube-rbac-proxy"
Apr 22 18:57:23.489667 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.489650 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ef84b19-d69a-4d3e-8917-b80a93056b4b" containerName="kserve-container"
Apr 22 18:57:23.493560 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.493544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.495365 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.495345 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0e291-predictor-serving-cert\""
Apr 22 18:57:23.495365 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.495359 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0e291-kube-rbac-proxy-sar-config\""
Apr 22 18:57:23.503253 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.503233 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"]
Apr 22 18:57:23.572521 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.572494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.572642 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.572562 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.572642 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.572598 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht5pv\" (UniqueName: \"kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.673466 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.673430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.673619 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.673477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht5pv\" (UniqueName: \"kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.673619 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.673596 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.673787 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:57:23.673766 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-0e291-predictor-serving-cert: secret "success-200-isvc-0e291-predictor-serving-cert" not found
Apr 22 18:57:23.673857 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:57:23.673846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls podName:0a7e6e64-c5d9-42b0-9de6-4880b17a1529 nodeName:}" failed. No retries permitted until 2026-04-22 18:57:24.173823483 +0000 UTC m=+662.511809315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls") pod "success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" (UID: "0a7e6e64-c5d9-42b0-9de6-4880b17a1529") : secret "success-200-isvc-0e291-predictor-serving-cert" not found
Apr 22 18:57:23.674134 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.674115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:23.684484 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:23.684456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht5pv\" (UniqueName: \"kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:24.016715 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.016682 2572 generic.go:358] "Generic (PLEG): container finished" podID="c86cd209-eb15-443e-8603-47d62cdea78b" containerID="f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4" exitCode=2
Apr 22 18:57:24.016875 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.016751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerDied","Data":"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"}
Apr 22 18:57:24.177006 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.176976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:24.179259 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.179230 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") pod \"success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:24.403704 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.403670 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:24.520985 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.520631 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"]
Apr 22 18:57:24.523167 ip-10-0-138-84 kubenswrapper[2572]: W0422 18:57:24.523137 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a7e6e64_c5d9_42b0_9de6_4880b17a1529.slice/crio-413c12a054f76dfa2f2789717e9c20a461e75af621cf7261ebafa9d54d2b8140 WatchSource:0}: Error finding container 413c12a054f76dfa2f2789717e9c20a461e75af621cf7261ebafa9d54d2b8140: Status 404 returned error can't find the container with id 413c12a054f76dfa2f2789717e9c20a461e75af621cf7261ebafa9d54d2b8140
Apr 22 18:57:24.524861 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.524844 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:57:24.729167 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.729078 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.18:8643/healthz\": dial tcp 10.132.0.18:8643: connect: connection refused"
Apr 22 18:57:24.734453 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:24.734409 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.18:8080: connect: connection refused"
Apr 22 18:57:25.020742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.020656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerStarted","Data":"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c"}
Apr 22 18:57:25.020742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.020694 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerStarted","Data":"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84"}
Apr 22 18:57:25.020742 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.020707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerStarted","Data":"413c12a054f76dfa2f2789717e9c20a461e75af621cf7261ebafa9d54d2b8140"}
Apr 22 18:57:25.021029 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.020827 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:25.035841 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.035643 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podStartSLOduration=2.035627776 podStartE2EDuration="2.035627776s" podCreationTimestamp="2026-04-22 18:57:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:25.034505409 +0000 UTC m=+663.372491256" watchObservedRunningTime="2026-04-22 18:57:25.035627776 +0000 UTC m=+663.373613618"
Apr 22 18:57:25.931913 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:25.931858 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.19:8080: connect: connection refused"
Apr 22 18:57:26.023537 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:26.023502 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:26.024513 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:26.024485 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:57:27.025893 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.025857 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:57:27.581302 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.581278 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"
Apr 22 18:57:27.704153 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704079 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") pod \"c86cd209-eb15-443e-8603-47d62cdea78b\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") "
Apr 22 18:57:27.704153 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704125 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location\") pod \"c86cd209-eb15-443e-8603-47d62cdea78b\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") "
Apr 22 18:57:27.704153 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704152 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p25z7\" (UniqueName: \"kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7\") pod \"c86cd209-eb15-443e-8603-47d62cdea78b\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") "
Apr 22 18:57:27.704386 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704201 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"c86cd209-eb15-443e-8603-47d62cdea78b\" (UID: \"c86cd209-eb15-443e-8603-47d62cdea78b\") "
Apr 22 18:57:27.704557 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704524 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c86cd209-eb15-443e-8603-47d62cdea78b" (UID: "c86cd209-eb15-443e-8603-47d62cdea78b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:57:27.704658 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.704623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "c86cd209-eb15-443e-8603-47d62cdea78b" (UID: "c86cd209-eb15-443e-8603-47d62cdea78b"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:57:27.706256 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.706234 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7" (OuterVolumeSpecName: "kube-api-access-p25z7") pod "c86cd209-eb15-443e-8603-47d62cdea78b" (UID: "c86cd209-eb15-443e-8603-47d62cdea78b"). InnerVolumeSpecName "kube-api-access-p25z7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:57:27.706363 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.706237 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c86cd209-eb15-443e-8603-47d62cdea78b" (UID: "c86cd209-eb15-443e-8603-47d62cdea78b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:27.805382 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.805353 2572 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c86cd209-eb15-443e-8603-47d62cdea78b-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\""
Apr 22 18:57:27.805382 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.805379 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c86cd209-eb15-443e-8603-47d62cdea78b-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\""
Apr 22 18:57:27.805537 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.805394 2572 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86cd209-eb15-443e-8603-47d62cdea78b-kserve-provision-location\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\""
Apr 22 18:57:27.805537 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:27.805406 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p25z7\" (UniqueName: \"kubernetes.io/projected/c86cd209-eb15-443e-8603-47d62cdea78b-kube-api-access-p25z7\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\""
Apr 22 18:57:28.029789 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.029759 2572 generic.go:358] "Generic (PLEG): container finished" podID="c86cd209-eb15-443e-8603-47d62cdea78b" containerID="18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a" exitCode=0
Apr 22 18:57:28.030231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.029842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerDied","Data":"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"}
Apr 22 18:57:28.030231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.029857 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"
Apr 22 18:57:28.030231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.029877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs" event={"ID":"c86cd209-eb15-443e-8603-47d62cdea78b","Type":"ContainerDied","Data":"f40447fcf833e4877eaafeaa1ad82c0a92de44128a9f5aebe0948cb7c7f36091"}
Apr 22 18:57:28.030231 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.029893 2572 scope.go:117] "RemoveContainer" containerID="f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"
Apr 22 18:57:28.038747 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.038732 2572 scope.go:117] "RemoveContainer" containerID="18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"
Apr 22 18:57:28.045929 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.045913 2572 scope.go:117] "RemoveContainer" containerID="6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"
Apr 22 18:57:28.049586 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.049565 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"]
Apr 22 18:57:28.055077 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.055050 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-847f6cf74b-q6frs"]
Apr 22 18:57:28.056034 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056019 2572 scope.go:117] "RemoveContainer" containerID="f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"
Apr 22 18:57:28.056314 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:57:28.056296 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4\": container with ID starting with f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4 not found: ID does not exist" containerID="f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"
Apr 22 18:57:28.056394 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056334 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4"} err="failed to get container status \"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4\": rpc error: code = NotFound desc = could not find container \"f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4\": container with ID starting with f5bd53bc92e9ba3e5a5c727983b59e136d8cd40e2146e9bfa7587744334993b4 not found: ID does not exist"
Apr 22 18:57:28.056394 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056359 2572 scope.go:117] "RemoveContainer" containerID="18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"
Apr 22 18:57:28.056612 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:57:28.056595 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a\": container with ID starting with 18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a not found: ID does not exist" containerID="18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"
Apr 22 18:57:28.056656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056618 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a"} err="failed to get container status \"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a\": rpc error: code = NotFound desc = could not find container \"18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a\": container with ID starting with 18cf6f0028ab2f24e41505b0e8473c0515ac65293269429116862871e8afc00a not found: ID does not exist"
Apr 22 18:57:28.056656 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056634 2572 scope.go:117] "RemoveContainer" containerID="6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"
Apr 22 18:57:28.056869 ip-10-0-138-84 kubenswrapper[2572]: E0422 18:57:28.056850 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1\": container with ID starting with 6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1 not found: ID does not exist" containerID="6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"
Apr 22 18:57:28.056938 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.056878 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1"} err="failed to get container status \"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1\": rpc error: code = NotFound desc = could not find container \"6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1\": container with ID starting with 6c83c6f369a928da21bfba13af7629a2e8b0e269a7190f085ff2d89a72cffea1 not found: ID does not exist"
Apr 22 18:57:28.204638 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:28.204609 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" path="/var/lib/kubelet/pods/c86cd209-eb15-443e-8603-47d62cdea78b/volumes"
Apr 22 18:57:32.031020 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:32.030989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 18:57:32.031565 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:32.031533 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:57:35.932079 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:35.932051 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"
Apr 22 18:57:42.031584 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:42.031543 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:57:52.031662 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:57:52.031622 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:58:02.031727 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:58:02.031690 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.20:8080: connect: connection refused"
Apr 22 18:58:12.032543 ip-10-0-138-84 kubenswrapper[2572]: I0422 18:58:12.032513 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"
Apr 22 19:06:02.477069 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.476990 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"]
Apr 22 19:06:02.479592 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.477279 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" containerID="cri-o://c341cf310944cd99903df05802ecf7490a590f4cfde7a1443cf58d5feda21f5d" gracePeriod=30
Apr 22 19:06:02.479592 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.477325 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kube-rbac-proxy" containerID="cri-o://db86617c6fe3315d72e7af914a01adbc2b88063b85481e020a2c25fadbb80a03" gracePeriod=30
Apr 22 19:06:02.524763 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.524734 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"]
Apr 22 19:06:02.525144 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525125 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kube-rbac-proxy"
Apr 22 19:06:02.525235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525148 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kube-rbac-proxy"
Apr 22 19:06:02.525235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525176 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container"
Apr 22 19:06:02.525235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525185 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container"
Apr 22 19:06:02.525235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525201 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="storage-initializer"
Apr 22 19:06:02.525235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525210 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="storage-initializer"
Apr 22 19:06:02.525478 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525289 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kserve-container"
Apr 22 19:06:02.525478 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.525300 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c86cd209-eb15-443e-8603-47d62cdea78b" containerName="kube-rbac-proxy"
Apr 22 19:06:02.528553 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.528533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"
Apr 22 19:06:02.530357 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.530332 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-50a5e-predictor-serving-cert\""
Apr 22 19:06:02.530849 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.530829 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\""
Apr 22 19:06:02.542175 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.542153 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"]
Apr 22 19:06:02.629257 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.629229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"
Apr 22 19:06:02.629414 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.629267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"
Apr 22 19:06:02.629414 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.629298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-fkvx4\" (UniqueName: \"kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.730366 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.730290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.730366 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.730333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.730366 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.730354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkvx4\" (UniqueName: \"kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.730931 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.730884 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.732643 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.732625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.739443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.739420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkvx4\" (UniqueName: \"kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4\") pod \"success-200-isvc-50a5e-predictor-6b875d9454-7mxtj\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.839178 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.839152 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:02.956500 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.956469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"] Apr 22 19:06:02.959491 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:06:02.959460 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f80fa95_499a_400e_bb2b_8590ce422211.slice/crio-feea9c2ce221a550221dfd17120ca7bc4f4553783958f449b0cb4e9400020ae9 WatchSource:0}: Error finding container feea9c2ce221a550221dfd17120ca7bc4f4553783958f449b0cb4e9400020ae9: Status 404 returned error can't find the container with id feea9c2ce221a550221dfd17120ca7bc4f4553783958f449b0cb4e9400020ae9 Apr 22 19:06:02.961058 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:02.961039 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:06:03.383573 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.383543 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea3777a4-31a2-460c-b453-d828f465a603" containerID="db86617c6fe3315d72e7af914a01adbc2b88063b85481e020a2c25fadbb80a03" exitCode=2 Apr 22 19:06:03.383745 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.383607 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerDied","Data":"db86617c6fe3315d72e7af914a01adbc2b88063b85481e020a2c25fadbb80a03"} Apr 22 19:06:03.385008 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.384976 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" 
event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerStarted","Data":"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e"} Apr 22 19:06:03.385008 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.385006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerStarted","Data":"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3"} Apr 22 19:06:03.385171 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.385021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerStarted","Data":"feea9c2ce221a550221dfd17120ca7bc4f4553783958f449b0cb4e9400020ae9"} Apr 22 19:06:03.385171 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.385140 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:03.401610 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:03.401574 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podStartSLOduration=1.401562274 podStartE2EDuration="1.401562274s" podCreationTimestamp="2026-04-22 19:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:03.399722092 +0000 UTC m=+1181.737707932" watchObservedRunningTime="2026-04-22 19:06:03.401562274 +0000 UTC m=+1181.739548115" Apr 22 19:06:04.387948 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:04.387890 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:04.389191 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:04.389162 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:05.391762 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.391730 2572 generic.go:358] "Generic (PLEG): container finished" podID="ea3777a4-31a2-460c-b453-d828f465a603" containerID="c341cf310944cd99903df05802ecf7490a590f4cfde7a1443cf58d5feda21f5d" exitCode=0 Apr 22 19:06:05.392110 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.391801 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerDied","Data":"c341cf310944cd99903df05802ecf7490a590f4cfde7a1443cf58d5feda21f5d"} Apr 22 19:06:05.392239 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.392218 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:05.815293 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.815265 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 19:06:05.852134 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.852100 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") pod \"ea3777a4-31a2-460c-b453-d828f465a603\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " Apr 22 19:06:05.852273 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.852157 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config\") pod \"ea3777a4-31a2-460c-b453-d828f465a603\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " Apr 22 19:06:05.852330 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.852267 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqcks\" (UniqueName: \"kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks\") pod \"ea3777a4-31a2-460c-b453-d828f465a603\" (UID: \"ea3777a4-31a2-460c-b453-d828f465a603\") " Apr 22 19:06:05.852447 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.852430 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-70f3d-kube-rbac-proxy-sar-config") pod "ea3777a4-31a2-460c-b453-d828f465a603" (UID: "ea3777a4-31a2-460c-b453-d828f465a603"). InnerVolumeSpecName "success-200-isvc-70f3d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:06:05.854216 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.854182 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks" (OuterVolumeSpecName: "kube-api-access-rqcks") pod "ea3777a4-31a2-460c-b453-d828f465a603" (UID: "ea3777a4-31a2-460c-b453-d828f465a603"). InnerVolumeSpecName "kube-api-access-rqcks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:05.854216 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.854200 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea3777a4-31a2-460c-b453-d828f465a603" (UID: "ea3777a4-31a2-460c-b453-d828f465a603"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:05.952975 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.952867 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea3777a4-31a2-460c-b453-d828f465a603-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:05.952975 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.952921 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-70f3d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea3777a4-31a2-460c-b453-d828f465a603-success-200-isvc-70f3d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:05.952975 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:05.952936 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqcks\" (UniqueName: \"kubernetes.io/projected/ea3777a4-31a2-460c-b453-d828f465a603-kube-api-access-rqcks\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:06.396154 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.396109 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" event={"ID":"ea3777a4-31a2-460c-b453-d828f465a603","Type":"ContainerDied","Data":"26102093c5bde4860298a7dd052f461dd65deae87c86ba4b778c6aa84d039ac1"} Apr 22 19:06:06.396154 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.396150 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg" Apr 22 19:06:06.396584 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.396154 2572 scope.go:117] "RemoveContainer" containerID="db86617c6fe3315d72e7af914a01adbc2b88063b85481e020a2c25fadbb80a03" Apr 22 19:06:06.403846 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.403828 2572 scope.go:117] "RemoveContainer" containerID="c341cf310944cd99903df05802ecf7490a590f4cfde7a1443cf58d5feda21f5d" Apr 22 19:06:06.424371 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.411760 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"] Apr 22 19:06:06.424371 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:06.413142 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-70f3d-predictor-f4754d89d-rtwtg"] Apr 22 19:06:08.204484 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:08.204450 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea3777a4-31a2-460c-b453-d828f465a603" path="/var/lib/kubelet/pods/ea3777a4-31a2-460c-b453-d828f465a603/volumes" Apr 22 19:06:10.396965 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:10.396933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:10.397329 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:10.397261 2572 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:20.398168 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:20.398130 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:30.398319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:30.398279 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:38.390685 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.390652 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"] Apr 22 19:06:38.391284 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.390949 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" containerID="cri-o://6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84" gracePeriod=30 Apr 22 19:06:38.391284 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.390991 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kube-rbac-proxy" 
containerID="cri-o://f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c" gracePeriod=30 Apr 22 19:06:38.433764 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.433734 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:06:38.434059 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434046 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" Apr 22 19:06:38.434111 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434060 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" Apr 22 19:06:38.434111 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434071 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kube-rbac-proxy" Apr 22 19:06:38.434111 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434077 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kube-rbac-proxy" Apr 22 19:06:38.434208 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434120 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kserve-container" Apr 22 19:06:38.434208 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.434130 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea3777a4-31a2-460c-b453-d828f465a603" containerName="kube-rbac-proxy" Apr 22 19:06:38.437908 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.437882 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.440313 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.440292 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-23058-predictor-serving-cert\"" Apr 22 19:06:38.441235 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.441021 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-23058-kube-rbac-proxy-sar-config\"" Apr 22 19:06:38.447843 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.447811 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:06:38.595763 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.595733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.595951 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.595792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9p5k\" (UniqueName: \"kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.595951 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.595821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.696581 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.696493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.696581 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.696546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9p5k\" (UniqueName: \"kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.696581 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.696567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.697163 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.697139 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.698951 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.698931 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.705669 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.705646 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9p5k\" (UniqueName: \"kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k\") pod \"success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.749990 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.749954 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:38.874539 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:38.874509 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:06:38.877726 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:06:38.877700 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod776e5fb7_2c36_4dc6_8640_05bc68ff731a.slice/crio-15f94bc5ba6bf36035c896de6914900c13855a5c81a71f206431bb6f458ea3c0 WatchSource:0}: Error finding container 15f94bc5ba6bf36035c896de6914900c13855a5c81a71f206431bb6f458ea3c0: Status 404 returned error can't find the container with id 15f94bc5ba6bf36035c896de6914900c13855a5c81a71f206431bb6f458ea3c0 Apr 22 19:06:39.485835 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.485802 2572 generic.go:358] "Generic (PLEG): container finished" podID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerID="f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c" exitCode=2 Apr 22 19:06:39.486285 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.485869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerDied","Data":"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c"} Apr 22 19:06:39.487413 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.487388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerStarted","Data":"bbd08e889cac36ac3bb94dec60d329918e850de11590104718f583ab65e56ba8"} Apr 22 19:06:39.487527 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.487418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerStarted","Data":"e727f66ea2165990d2256469720a8ff8449c77dbd4bb4e37c09785e5694cdaa2"} Apr 22 19:06:39.487527 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.487430 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerStarted","Data":"15f94bc5ba6bf36035c896de6914900c13855a5c81a71f206431bb6f458ea3c0"} Apr 22 19:06:39.487597 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.487536 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:39.504920 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:39.504860 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podStartSLOduration=1.5048444650000001 podStartE2EDuration="1.504844465s" podCreationTimestamp="2026-04-22 19:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:39.503385672 +0000 UTC m=+1217.841371512" watchObservedRunningTime="2026-04-22 19:06:39.504844465 +0000 UTC m=+1217.842830300" Apr 22 19:06:40.397468 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:40.397428 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 19:06:40.490244 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:40.490213 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:40.491486 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:40.491448 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:06:41.492679 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.492635 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:06:41.732971 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.732944 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" Apr 22 19:06:41.819764 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.819729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht5pv\" (UniqueName: \"kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv\") pod \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " Apr 22 19:06:41.819990 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.819815 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") pod \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " Apr 22 19:06:41.819990 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.819865 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"success-200-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config\") pod \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\" (UID: \"0a7e6e64-c5d9-42b0-9de6-4880b17a1529\") " Apr 22 19:06:41.820304 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.820274 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-0e291-kube-rbac-proxy-sar-config") pod "0a7e6e64-c5d9-42b0-9de6-4880b17a1529" (UID: "0a7e6e64-c5d9-42b0-9de6-4880b17a1529"). InnerVolumeSpecName "success-200-isvc-0e291-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:06:41.821944 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.821920 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0a7e6e64-c5d9-42b0-9de6-4880b17a1529" (UID: "0a7e6e64-c5d9-42b0-9de6-4880b17a1529"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:06:41.821944 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.821927 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv" (OuterVolumeSpecName: "kube-api-access-ht5pv") pod "0a7e6e64-c5d9-42b0-9de6-4880b17a1529" (UID: "0a7e6e64-c5d9-42b0-9de6-4880b17a1529"). InnerVolumeSpecName "kube-api-access-ht5pv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:06:41.920765 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.920709 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-0e291-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-success-200-isvc-0e291-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:41.920765 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.920757 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ht5pv\" (UniqueName: \"kubernetes.io/projected/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-kube-api-access-ht5pv\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:41.920765 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:41.920769 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a7e6e64-c5d9-42b0-9de6-4880b17a1529-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:06:42.495860 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.495777 2572 generic.go:358] "Generic (PLEG): container finished" podID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerID="6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84" exitCode=0 Apr 22 19:06:42.495860 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.495847 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerDied","Data":"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84"} Apr 22 19:06:42.495860 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.495854 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" Apr 22 19:06:42.496374 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.495873 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt" event={"ID":"0a7e6e64-c5d9-42b0-9de6-4880b17a1529","Type":"ContainerDied","Data":"413c12a054f76dfa2f2789717e9c20a461e75af621cf7261ebafa9d54d2b8140"} Apr 22 19:06:42.496374 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.495889 2572 scope.go:117] "RemoveContainer" containerID="f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c" Apr 22 19:06:42.503354 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.503338 2572 scope.go:117] "RemoveContainer" containerID="6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84" Apr 22 19:06:42.511257 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.511229 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"] Apr 22 19:06:42.513471 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.513450 2572 scope.go:117] "RemoveContainer" containerID="f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c" Apr 22 19:06:42.514001 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:06:42.513814 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c\": container with ID starting with f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c not found: ID does not exist" containerID="f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c" Apr 22 19:06:42.514084 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.514008 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c"} 
err="failed to get container status \"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c\": rpc error: code = NotFound desc = could not find container \"f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c\": container with ID starting with f26edf85bcd8b92c43056de94cef43dfeeb0fb4858e724918f677f05f73d450c not found: ID does not exist" Apr 22 19:06:42.514084 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.514037 2572 scope.go:117] "RemoveContainer" containerID="6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84" Apr 22 19:06:42.514326 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:06:42.514304 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84\": container with ID starting with 6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84 not found: ID does not exist" containerID="6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84" Apr 22 19:06:42.514384 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.514336 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84"} err="failed to get container status \"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84\": rpc error: code = NotFound desc = could not find container \"6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84\": container with ID starting with 6bd2f2906926114e2be3b658659fc350dd582e9d820efdc23ecda7c38e1e5d84 not found: ID does not exist" Apr 22 19:06:42.515273 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:42.515256 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0e291-predictor-5b45dc76dd-4q8jt"] Apr 22 19:06:44.205204 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:44.205171 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" path="/var/lib/kubelet/pods/0a7e6e64-c5d9-42b0-9de6-4880b17a1529/volumes" Apr 22 19:06:46.497626 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:46.497599 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:06:46.498140 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:46.498113 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:06:50.398052 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:50.398019 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:06:56.498121 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:06:56.498079 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:07:06.498705 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:06.498619 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:07:12.734621 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.734524 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"] Apr 22 19:07:12.735073 ip-10-0-138-84 kubenswrapper[2572]: 
I0422 19:07:12.734830 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" containerID="cri-o://a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3" gracePeriod=30 Apr 22 19:07:12.735073 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.734857 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kube-rbac-proxy" containerID="cri-o://57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e" gracePeriod=30 Apr 22 19:07:12.766574 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766547 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:07:12.766870 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766857 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" Apr 22 19:07:12.766968 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766871 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" Apr 22 19:07:12.766968 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766887 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kube-rbac-proxy" Apr 22 19:07:12.766968 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766894 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kube-rbac-proxy" Apr 22 19:07:12.766968 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766956 2572 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kserve-container" Apr 22 19:07:12.766968 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.766965 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a7e6e64-c5d9-42b0-9de6-4880b17a1529" containerName="kube-rbac-proxy" Apr 22 19:07:12.769971 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.769956 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:12.771846 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.771830 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\"" Apr 22 19:07:12.771967 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.771831 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-f6fce-predictor-serving-cert\"" Apr 22 19:07:12.780238 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.780212 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:07:12.949507 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.949475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:12.949683 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.949523 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:12.949683 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:12.949569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvzz\" (UniqueName: \"kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.050771 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.050743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.050893 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.050793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvzz\" (UniqueName: \"kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.050893 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.050835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.051002 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:13.050912 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-serving-cert: secret "success-200-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:13.051002 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:13.050986 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls podName:f99c4911-3d98-40b7-82d2-f9a31411aa1c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:13.550969518 +0000 UTC m=+1251.888955343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls") pod "success-200-isvc-f6fce-predictor-78694d6766-cnm2x" (UID: "f99c4911-3d98-40b7-82d2-f9a31411aa1c") : secret "success-200-isvc-f6fce-predictor-serving-cert" not found Apr 22 19:07:13.051539 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.051520 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.062232 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.062203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvzz\" (UniqueName: 
\"kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.556269 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.556240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.558580 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.558556 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") pod \"success-200-isvc-f6fce-predictor-78694d6766-cnm2x\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.587613 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.587588 2572 generic.go:358] "Generic (PLEG): container finished" podID="7f80fa95-499a-400e-bb2b-8590ce422211" containerID="57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e" exitCode=2 Apr 22 19:07:13.587718 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.587653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerDied","Data":"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e"} Apr 22 19:07:13.679800 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.679764 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:13.795416 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:13.795380 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:07:13.799384 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:07:13.799358 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99c4911_3d98_40b7_82d2_f9a31411aa1c.slice/crio-f6f8d5e718cfa39ad279411208156a46b256ea457b8560981e2b90f2d7f0093b WatchSource:0}: Error finding container f6f8d5e718cfa39ad279411208156a46b256ea457b8560981e2b90f2d7f0093b: Status 404 returned error can't find the container with id f6f8d5e718cfa39ad279411208156a46b256ea457b8560981e2b90f2d7f0093b Apr 22 19:07:14.591319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:14.591283 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerStarted","Data":"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706"} Apr 22 19:07:14.591319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:14.591319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerStarted","Data":"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24"} Apr 22 19:07:14.591554 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:14.591328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerStarted","Data":"f6f8d5e718cfa39ad279411208156a46b256ea457b8560981e2b90f2d7f0093b"} Apr 22 19:07:14.591554 ip-10-0-138-84 
kubenswrapper[2572]: I0422 19:07:14.591358 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:14.607889 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:14.607842 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podStartSLOduration=2.607830504 podStartE2EDuration="2.607830504s" podCreationTimestamp="2026-04-22 19:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:14.606420415 +0000 UTC m=+1252.944406256" watchObservedRunningTime="2026-04-22 19:07:14.607830504 +0000 UTC m=+1252.945816344" Apr 22 19:07:15.393363 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:15.393319 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 22 19:07:15.594139 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:15.594105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:15.595354 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:15.595326 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:15.983164 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:15.983140 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:07:16.072912 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.072863 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config\") pod \"7f80fa95-499a-400e-bb2b-8590ce422211\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " Apr 22 19:07:16.073050 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.072951 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls\") pod \"7f80fa95-499a-400e-bb2b-8590ce422211\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " Apr 22 19:07:16.073050 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.073000 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkvx4\" (UniqueName: \"kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4\") pod \"7f80fa95-499a-400e-bb2b-8590ce422211\" (UID: \"7f80fa95-499a-400e-bb2b-8590ce422211\") " Apr 22 19:07:16.073238 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.073214 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-50a5e-kube-rbac-proxy-sar-config") pod "7f80fa95-499a-400e-bb2b-8590ce422211" (UID: "7f80fa95-499a-400e-bb2b-8590ce422211"). InnerVolumeSpecName "success-200-isvc-50a5e-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:16.075015 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.074985 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4" (OuterVolumeSpecName: "kube-api-access-fkvx4") pod "7f80fa95-499a-400e-bb2b-8590ce422211" (UID: "7f80fa95-499a-400e-bb2b-8590ce422211"). InnerVolumeSpecName "kube-api-access-fkvx4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:16.075123 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.075057 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7f80fa95-499a-400e-bb2b-8590ce422211" (UID: "7f80fa95-499a-400e-bb2b-8590ce422211"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:16.173603 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.173564 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkvx4\" (UniqueName: \"kubernetes.io/projected/7f80fa95-499a-400e-bb2b-8590ce422211-kube-api-access-fkvx4\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.173603 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.173596 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-50a5e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7f80fa95-499a-400e-bb2b-8590ce422211-success-200-isvc-50a5e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.173801 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.173612 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f80fa95-499a-400e-bb2b-8590ce422211-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:16.498208 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.498122 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 22 19:07:16.598401 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598364 2572 generic.go:358] "Generic (PLEG): container finished" podID="7f80fa95-499a-400e-bb2b-8590ce422211" containerID="a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3" exitCode=0 Apr 22 19:07:16.598561 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerDied","Data":"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3"} Apr 22 19:07:16.598561 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" event={"ID":"7f80fa95-499a-400e-bb2b-8590ce422211","Type":"ContainerDied","Data":"feea9c2ce221a550221dfd17120ca7bc4f4553783958f449b0cb4e9400020ae9"} Apr 22 19:07:16.598561 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598515 2572 scope.go:117] "RemoveContainer" containerID="57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e" Apr 22 19:07:16.598561 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598469 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj" Apr 22 19:07:16.599001 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.598970 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:16.605855 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.605838 2572 scope.go:117] "RemoveContainer" containerID="a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3" Apr 22 19:07:16.612834 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.612806 2572 scope.go:117] "RemoveContainer" containerID="57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e" Apr 22 19:07:16.612976 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.612958 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"] Apr 22 19:07:16.613181 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:16.613159 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e\": container with ID starting with 57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e not found: ID does not exist" containerID="57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e" Apr 22 19:07:16.613253 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.613192 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e"} err="failed to get container status \"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e\": rpc error: code = NotFound desc = could not find container 
\"57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e\": container with ID starting with 57b3b08265ec346c8f178c1562c09ce19b397e1967e66cbe5c62430d26e3a20e not found: ID does not exist" Apr 22 19:07:16.613253 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.613218 2572 scope.go:117] "RemoveContainer" containerID="a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3" Apr 22 19:07:16.613470 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:16.613453 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3\": container with ID starting with a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3 not found: ID does not exist" containerID="a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3" Apr 22 19:07:16.613512 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.613476 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3"} err="failed to get container status \"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3\": rpc error: code = NotFound desc = could not find container \"a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3\": container with ID starting with a553471e45c46cd7d89ceeccb953f1a018bf63c2d0eabed7f9840a32ce88fbd3 not found: ID does not exist" Apr 22 19:07:16.617418 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:16.617398 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-50a5e-predictor-6b875d9454-7mxtj"] Apr 22 19:07:18.206227 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:18.206191 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" path="/var/lib/kubelet/pods/7f80fa95-499a-400e-bb2b-8590ce422211/volumes" Apr 22 19:07:21.604201 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:21.604171 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:07:21.604687 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:21.604662 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:26.499576 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:26.499547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:07:31.605139 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:31.605097 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:41.605189 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:41.605142 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:48.637107 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.637073 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:07:48.637522 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.637402 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" containerID="cri-o://e727f66ea2165990d2256469720a8ff8449c77dbd4bb4e37c09785e5694cdaa2" gracePeriod=30 Apr 22 19:07:48.637522 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.637446 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kube-rbac-proxy" containerID="cri-o://bbd08e889cac36ac3bb94dec60d329918e850de11590104718f583ab65e56ba8" gracePeriod=30 Apr 22 19:07:48.676216 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676190 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:07:48.676464 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676453 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kube-rbac-proxy" Apr 22 19:07:48.676506 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676466 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kube-rbac-proxy" Apr 22 19:07:48.676506 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676478 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" Apr 22 19:07:48.676506 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676483 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" Apr 22 19:07:48.676597 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676528 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kube-rbac-proxy" Apr 22 
19:07:48.676597 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.676535 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7f80fa95-499a-400e-bb2b-8590ce422211" containerName="kserve-container" Apr 22 19:07:48.679392 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.679377 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.681686 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.681665 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\"" Apr 22 19:07:48.681686 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.681677 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-ffb29-predictor-serving-cert\"" Apr 22 19:07:48.695322 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.693404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:07:48.698646 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.698620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbpgg\" (UniqueName: \"kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.698767 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.698660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config\") pod 
\"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.698767 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.698694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.800097 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.800062 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbpgg\" (UniqueName: \"kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.800267 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.800105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.800267 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.800127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: 
\"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.800267 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:48.800219 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-serving-cert: secret "success-200-isvc-ffb29-predictor-serving-cert" not found Apr 22 19:07:48.800415 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:07:48.800281 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls podName:72e8ea5f-a383-4292-80ae-d6039c848008 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:49.300261497 +0000 UTC m=+1287.638247333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls") pod "success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" (UID: "72e8ea5f-a383-4292-80ae-d6039c848008") : secret "success-200-isvc-ffb29-predictor-serving-cert" not found Apr 22 19:07:48.800728 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.800705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:48.808195 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:48.808175 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbpgg\" (UniqueName: \"kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " 
pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:49.304009 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.303974 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:49.306269 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.306251 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") pod \"success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:49.591118 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.591028 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:49.688430 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.688397 2572 generic.go:358] "Generic (PLEG): container finished" podID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerID="bbd08e889cac36ac3bb94dec60d329918e850de11590104718f583ab65e56ba8" exitCode=2 Apr 22 19:07:49.688737 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.688457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerDied","Data":"bbd08e889cac36ac3bb94dec60d329918e850de11590104718f583ab65e56ba8"} Apr 22 19:07:49.733532 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:49.733431 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:07:49.735989 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:07:49.735960 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e8ea5f_a383_4292_80ae_d6039c848008.slice/crio-f01b911e8a3590ac665fbab3620d89f65fba347bb27764b2a47955aa1c65fb85 WatchSource:0}: Error finding container f01b911e8a3590ac665fbab3620d89f65fba347bb27764b2a47955aa1c65fb85: Status 404 returned error can't find the container with id f01b911e8a3590ac665fbab3620d89f65fba347bb27764b2a47955aa1c65fb85 Apr 22 19:07:50.692782 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:50.692742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerStarted","Data":"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e"} Apr 22 19:07:50.692782 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:50.692785 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerStarted","Data":"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a"} Apr 22 19:07:50.693194 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:50.692798 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerStarted","Data":"f01b911e8a3590ac665fbab3620d89f65fba347bb27764b2a47955aa1c65fb85"} Apr 22 19:07:50.693194 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:50.692886 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:50.708467 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:50.708417 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podStartSLOduration=2.7084030070000003 podStartE2EDuration="2.708403007s" podCreationTimestamp="2026-04-22 19:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:50.707460817 +0000 UTC m=+1289.045446657" watchObservedRunningTime="2026-04-22 19:07:50.708403007 +0000 UTC m=+1289.046388850" Apr 22 19:07:51.493753 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.493716 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 22 19:07:51.605105 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.605073 2572 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 22 19:07:51.697070 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.697035 2572 generic.go:358] "Generic (PLEG): container finished" podID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerID="e727f66ea2165990d2256469720a8ff8449c77dbd4bb4e37c09785e5694cdaa2" exitCode=0 Apr 22 19:07:51.697507 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.697110 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerDied","Data":"e727f66ea2165990d2256469720a8ff8449c77dbd4bb4e37c09785e5694cdaa2"} Apr 22 19:07:51.697507 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.697365 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:51.698565 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.698540 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:07:51.772767 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.772745 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:07:51.821369 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.821344 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config\") pod \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " Apr 22 19:07:51.821487 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.821373 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9p5k\" (UniqueName: \"kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k\") pod \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " Apr 22 19:07:51.821487 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.821422 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls\") pod \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\" (UID: \"776e5fb7-2c36-4dc6-8640-05bc68ff731a\") " Apr 22 19:07:51.821699 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.821670 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-23058-kube-rbac-proxy-sar-config") pod "776e5fb7-2c36-4dc6-8640-05bc68ff731a" (UID: "776e5fb7-2c36-4dc6-8640-05bc68ff731a"). InnerVolumeSpecName "success-200-isvc-23058-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:51.823413 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.823384 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "776e5fb7-2c36-4dc6-8640-05bc68ff731a" (UID: "776e5fb7-2c36-4dc6-8640-05bc68ff731a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:51.823413 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.823393 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k" (OuterVolumeSpecName: "kube-api-access-w9p5k") pod "776e5fb7-2c36-4dc6-8640-05bc68ff731a" (UID: "776e5fb7-2c36-4dc6-8640-05bc68ff731a"). InnerVolumeSpecName "kube-api-access-w9p5k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:51.922678 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.922644 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/776e5fb7-2c36-4dc6-8640-05bc68ff731a-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:51.922678 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.922671 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-23058-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/776e5fb7-2c36-4dc6-8640-05bc68ff731a-success-200-isvc-23058-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:51.922678 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:51.922681 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9p5k\" (UniqueName: \"kubernetes.io/projected/776e5fb7-2c36-4dc6-8640-05bc68ff731a-kube-api-access-w9p5k\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:07:52.700816 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.700780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" event={"ID":"776e5fb7-2c36-4dc6-8640-05bc68ff731a","Type":"ContainerDied","Data":"15f94bc5ba6bf36035c896de6914900c13855a5c81a71f206431bb6f458ea3c0"} Apr 22 19:07:52.701395 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.700822 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv" Apr 22 19:07:52.701395 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.700829 2572 scope.go:117] "RemoveContainer" containerID="bbd08e889cac36ac3bb94dec60d329918e850de11590104718f583ab65e56ba8" Apr 22 19:07:52.701395 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.701111 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:07:52.708303 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.708283 2572 scope.go:117] "RemoveContainer" containerID="e727f66ea2165990d2256469720a8ff8449c77dbd4bb4e37c09785e5694cdaa2" Apr 22 19:07:52.717022 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.717001 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:07:52.720370 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:52.720345 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-23058-predictor-68c8c4c7bb-7j2rv"] Apr 22 19:07:54.206482 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:54.206444 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" 
path="/var/lib/kubelet/pods/776e5fb7-2c36-4dc6-8640-05bc68ff731a/volumes" Apr 22 19:07:57.705815 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:57.705786 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:07:57.706364 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:07:57.706338 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:08:01.605063 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:08:01.605032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:08:07.707159 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:08:07.707117 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:08:17.706561 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:08:17.706522 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:08:27.707114 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:08:27.707073 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.24:8080: connect: connection refused" Apr 22 19:08:37.707108 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:08:37.707022 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:16:27.492646 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.492608 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:16:27.493167 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.492946 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" containerID="cri-o://454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24" gracePeriod=30 Apr 22 19:16:27.493167 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.493005 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kube-rbac-proxy" containerID="cri-o://85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706" gracePeriod=30 Apr 22 19:16:27.576498 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576459 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:16:27.576760 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576747 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" Apr 22 19:16:27.576760 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576762 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" Apr 22 19:16:27.576858 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576781 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kube-rbac-proxy" Apr 22 19:16:27.576858 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576787 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kube-rbac-proxy" Apr 22 19:16:27.576858 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576834 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kube-rbac-proxy" Apr 22 19:16:27.576858 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.576843 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="776e5fb7-2c36-4dc6-8640-05bc68ff731a" containerName="kserve-container" Apr 22 19:16:27.579790 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.579764 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.582005 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.581966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c8908-predictor-serving-cert\"" Apr 22 19:16:27.582127 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.581971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-c8908-kube-rbac-proxy-sar-config\"" Apr 22 19:16:27.593440 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.593409 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:16:27.688479 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.688440 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"success-200-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.688672 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.688488 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.688672 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.688508 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztc9h\" (UniqueName: \"kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.789772 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.789740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.789980 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.789795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.789980 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.789826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztc9h\" (UniqueName: \"kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.789980 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:16:27.789912 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-c8908-predictor-serving-cert: secret "success-200-isvc-c8908-predictor-serving-cert" not found Apr 22 19:16:27.790144 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:16:27.790004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls podName:577c7ee9-bb8b-48ec-854a-02e79872459b nodeName:}" failed. No retries permitted until 2026-04-22 19:16:28.289972281 +0000 UTC m=+1806.627958109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls") pod "success-200-isvc-c8908-predictor-b7c844446-s6swb" (UID: "577c7ee9-bb8b-48ec-854a-02e79872459b") : secret "success-200-isvc-c8908-predictor-serving-cert" not found Apr 22 19:16:27.790384 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.790363 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:27.798182 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:27.798153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztc9h\" (UniqueName: \"kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:28.094618 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.094530 2572 generic.go:358] "Generic (PLEG): container finished" podID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerID="85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706" exitCode=2 Apr 22 19:16:28.094618 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.094569 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerDied","Data":"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706"} Apr 22 19:16:28.294332 ip-10-0-138-84 kubenswrapper[2572]: I0422 
19:16:28.294290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:28.296593 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.296561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") pod \"success-200-isvc-c8908-predictor-b7c844446-s6swb\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:28.491620 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.491518 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:28.614867 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.614838 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:16:28.617381 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:16:28.617351 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod577c7ee9_bb8b_48ec_854a_02e79872459b.slice/crio-04d087aaebca6acad6ad85029930b1982291696c6d818959ea49293ae4e14f4a WatchSource:0}: Error finding container 04d087aaebca6acad6ad85029930b1982291696c6d818959ea49293ae4e14f4a: Status 404 returned error can't find the container with id 04d087aaebca6acad6ad85029930b1982291696c6d818959ea49293ae4e14f4a Apr 22 19:16:28.619962 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:28.619934 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 
19:16:29.099004 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:29.098970 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerStarted","Data":"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb"} Apr 22 19:16:29.099004 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:29.099007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerStarted","Data":"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3"} Apr 22 19:16:29.099217 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:29.099018 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerStarted","Data":"04d087aaebca6acad6ad85029930b1982291696c6d818959ea49293ae4e14f4a"} Apr 22 19:16:29.099217 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:29.099105 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:29.116014 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:29.115931 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podStartSLOduration=2.115911972 podStartE2EDuration="2.115911972s" podCreationTimestamp="2026-04-22 19:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:16:29.114785194 +0000 UTC m=+1807.452771035" watchObservedRunningTime="2026-04-22 19:16:29.115911972 +0000 UTC m=+1807.453897804" Apr 22 19:16:30.101784 ip-10-0-138-84 kubenswrapper[2572]: I0422 
19:16:30.101748 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:30.102951 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:30.102923 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:16:30.938271 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:30.938245 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:16:31.015189 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.015091 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config\") pod \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " Apr 22 19:16:31.015189 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.015151 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") pod \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " Apr 22 19:16:31.015425 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.015325 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfvzz\" (UniqueName: \"kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz\") pod \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\" (UID: \"f99c4911-3d98-40b7-82d2-f9a31411aa1c\") " Apr 22 19:16:31.015484 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.015446 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-f6fce-kube-rbac-proxy-sar-config") pod "f99c4911-3d98-40b7-82d2-f9a31411aa1c" (UID: "f99c4911-3d98-40b7-82d2-f9a31411aa1c"). InnerVolumeSpecName "success-200-isvc-f6fce-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:16:31.015550 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.015526 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-f6fce-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f99c4911-3d98-40b7-82d2-f9a31411aa1c-success-200-isvc-f6fce-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:16:31.017288 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.017263 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz" (OuterVolumeSpecName: "kube-api-access-dfvzz") pod "f99c4911-3d98-40b7-82d2-f9a31411aa1c" (UID: "f99c4911-3d98-40b7-82d2-f9a31411aa1c"). InnerVolumeSpecName "kube-api-access-dfvzz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:16:31.017397 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.017300 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f99c4911-3d98-40b7-82d2-f9a31411aa1c" (UID: "f99c4911-3d98-40b7-82d2-f9a31411aa1c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:16:31.105874 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.105840 2572 generic.go:358] "Generic (PLEG): container finished" podID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerID="454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24" exitCode=0 Apr 22 19:16:31.106341 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.105932 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" Apr 22 19:16:31.106341 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.105933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerDied","Data":"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24"} Apr 22 19:16:31.106341 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.105974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x" event={"ID":"f99c4911-3d98-40b7-82d2-f9a31411aa1c","Type":"ContainerDied","Data":"f6f8d5e718cfa39ad279411208156a46b256ea457b8560981e2b90f2d7f0093b"} Apr 22 19:16:31.106341 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.106007 2572 scope.go:117] "RemoveContainer" containerID="85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706" Apr 22 19:16:31.106540 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.106429 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:16:31.114122 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.114105 2572 scope.go:117] "RemoveContainer" 
containerID="454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24" Apr 22 19:16:31.116319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.116288 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dfvzz\" (UniqueName: \"kubernetes.io/projected/f99c4911-3d98-40b7-82d2-f9a31411aa1c-kube-api-access-dfvzz\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:16:31.116413 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.116323 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f99c4911-3d98-40b7-82d2-f9a31411aa1c-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:16:31.121519 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.121500 2572 scope.go:117] "RemoveContainer" containerID="85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706" Apr 22 19:16:31.121819 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:16:31.121801 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706\": container with ID starting with 85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706 not found: ID does not exist" containerID="85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706" Apr 22 19:16:31.121865 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.121829 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706"} err="failed to get container status \"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706\": rpc error: code = NotFound desc = could not find container \"85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706\": container with ID starting with 85b1c5b4afbc61c9d492f3509d98ae2fcd45661941537c727a372527c8a2e706 not found: ID does not exist" Apr 22 
19:16:31.121865 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.121851 2572 scope.go:117] "RemoveContainer" containerID="454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24" Apr 22 19:16:31.122167 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:16:31.122114 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24\": container with ID starting with 454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24 not found: ID does not exist" containerID="454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24" Apr 22 19:16:31.122167 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.122136 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24"} err="failed to get container status \"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24\": rpc error: code = NotFound desc = could not find container \"454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24\": container with ID starting with 454ec223f8e043558c028f6fff9834c0a651ef722e61ed35b2d2c02164a75f24 not found: ID does not exist" Apr 22 19:16:31.126182 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.126051 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:16:31.128122 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:31.128101 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-f6fce-predictor-78694d6766-cnm2x"] Apr 22 19:16:32.204806 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:32.204769 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" path="/var/lib/kubelet/pods/f99c4911-3d98-40b7-82d2-f9a31411aa1c/volumes" Apr 22 19:16:36.110796 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:36.110762 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:16:36.111272 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:36.111235 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:16:46.111842 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:46.111800 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:16:56.111409 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:16:56.111366 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:17:03.423551 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.423514 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:17:03.424095 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.423799 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" containerID="cri-o://d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a" gracePeriod=30 Apr 22 19:17:03.424095 ip-10-0-138-84 
kubenswrapper[2572]: I0422 19:17:03.423838 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kube-rbac-proxy" containerID="cri-o://0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e" gracePeriod=30 Apr 22 19:17:03.443380 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443342 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:17:03.443642 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443628 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" Apr 22 19:17:03.443642 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443641 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" Apr 22 19:17:03.443789 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443652 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kube-rbac-proxy" Apr 22 19:17:03.443789 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443657 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kube-rbac-proxy" Apr 22 19:17:03.443789 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443699 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kserve-container" Apr 22 19:17:03.443789 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.443710 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f99c4911-3d98-40b7-82d2-f9a31411aa1c" containerName="kube-rbac-proxy" Apr 22 19:17:03.446504 ip-10-0-138-84 kubenswrapper[2572]: I0422 
19:17:03.446483 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.448504 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.448485 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e7803-predictor-serving-cert\"" Apr 22 19:17:03.448504 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.448495 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e7803-kube-rbac-proxy-sar-config\"" Apr 22 19:17:03.457598 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.457570 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:17:03.567116 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.567071 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbc95\" (UniqueName: \"kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.567116 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.567116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.567332 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.567156 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.668291 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.668247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.668291 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.668293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.668520 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.668354 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbc95\" (UniqueName: \"kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.668958 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.668937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"success-200-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.670806 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.670783 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.676497 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.676421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbc95\" (UniqueName: \"kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95\") pod \"success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.757312 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.757262 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:03.880392 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:03.880244 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:17:03.882957 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:17:03.882922 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfed8332d_991a_446a_a7f2_afb45300ed5f.slice/crio-17a4de3a961e89273d28149bdfb74136382310eb0152d7950e5c10ca11611b01 WatchSource:0}: Error finding container 17a4de3a961e89273d28149bdfb74136382310eb0152d7950e5c10ca11611b01: Status 404 returned error can't find the container with id 17a4de3a961e89273d28149bdfb74136382310eb0152d7950e5c10ca11611b01 Apr 22 19:17:04.204954 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.204840 2572 generic.go:358] "Generic (PLEG): container finished" podID="72e8ea5f-a383-4292-80ae-d6039c848008" containerID="0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e" exitCode=2 Apr 22 19:17:04.205408 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.205379 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerDied","Data":"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e"} Apr 22 19:17:04.206375 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.206349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerStarted","Data":"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058"} Apr 22 19:17:04.206486 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.206380 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerStarted","Data":"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70"} Apr 22 19:17:04.206486 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.206393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerStarted","Data":"17a4de3a961e89273d28149bdfb74136382310eb0152d7950e5c10ca11611b01"} Apr 22 19:17:04.206573 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.206492 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:04.222727 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:04.222670 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podStartSLOduration=1.222654738 podStartE2EDuration="1.222654738s" podCreationTimestamp="2026-04-22 19:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:17:04.220843871 +0000 UTC m=+1842.558829715" watchObservedRunningTime="2026-04-22 19:17:04.222654738 +0000 UTC m=+1842.560640646" Apr 22 19:17:05.208959 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:05.208930 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:05.210279 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:05.210244 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:06.112241 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:06.112199 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:17:06.211450 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:06.211411 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:06.971622 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:06.971591 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:17:07.098660 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.098625 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") pod \"72e8ea5f-a383-4292-80ae-d6039c848008\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " Apr 22 19:17:07.098835 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.098690 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config\") pod \"72e8ea5f-a383-4292-80ae-d6039c848008\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " Apr 22 19:17:07.098835 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.098717 2572 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jbpgg\" (UniqueName: \"kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg\") pod \"72e8ea5f-a383-4292-80ae-d6039c848008\" (UID: \"72e8ea5f-a383-4292-80ae-d6039c848008\") " Apr 22 19:17:07.099160 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.099127 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-ffb29-kube-rbac-proxy-sar-config") pod "72e8ea5f-a383-4292-80ae-d6039c848008" (UID: "72e8ea5f-a383-4292-80ae-d6039c848008"). InnerVolumeSpecName "success-200-isvc-ffb29-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:17:07.100937 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.100891 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg" (OuterVolumeSpecName: "kube-api-access-jbpgg") pod "72e8ea5f-a383-4292-80ae-d6039c848008" (UID: "72e8ea5f-a383-4292-80ae-d6039c848008"). InnerVolumeSpecName "kube-api-access-jbpgg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:17:07.100999 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.100935 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "72e8ea5f-a383-4292-80ae-d6039c848008" (UID: "72e8ea5f-a383-4292-80ae-d6039c848008"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:17:07.199778 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.199735 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jbpgg\" (UniqueName: \"kubernetes.io/projected/72e8ea5f-a383-4292-80ae-d6039c848008-kube-api-access-jbpgg\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:07.199778 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.199769 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/72e8ea5f-a383-4292-80ae-d6039c848008-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:07.199778 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.199785 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-ffb29-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/72e8ea5f-a383-4292-80ae-d6039c848008-success-200-isvc-ffb29-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:07.217104 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.217070 2572 generic.go:358] "Generic (PLEG): container finished" podID="72e8ea5f-a383-4292-80ae-d6039c848008" containerID="d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a" exitCode=0 Apr 22 19:17:07.217481 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.217153 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerDied","Data":"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a"} Apr 22 19:17:07.217481 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.217196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" 
event={"ID":"72e8ea5f-a383-4292-80ae-d6039c848008","Type":"ContainerDied","Data":"f01b911e8a3590ac665fbab3620d89f65fba347bb27764b2a47955aa1c65fb85"} Apr 22 19:17:07.217481 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.217213 2572 scope.go:117] "RemoveContainer" containerID="0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e" Apr 22 19:17:07.217481 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.217166 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd" Apr 22 19:17:07.224657 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.224620 2572 scope.go:117] "RemoveContainer" containerID="d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a" Apr 22 19:17:07.232005 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.231984 2572 scope.go:117] "RemoveContainer" containerID="0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e" Apr 22 19:17:07.232273 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:07.232254 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e\": container with ID starting with 0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e not found: ID does not exist" containerID="0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e" Apr 22 19:17:07.232319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.232283 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e"} err="failed to get container status \"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e\": rpc error: code = NotFound desc = could not find container \"0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e\": container with ID starting with 
0ed16068c6b9312b0ddc23c1d3fd38c8e704e655e0716d993a8cab0ba04c0e2e not found: ID does not exist" Apr 22 19:17:07.232319 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.232305 2572 scope.go:117] "RemoveContainer" containerID="d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a" Apr 22 19:17:07.232555 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:07.232534 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a\": container with ID starting with d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a not found: ID does not exist" containerID="d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a" Apr 22 19:17:07.232614 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.232560 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a"} err="failed to get container status \"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a\": rpc error: code = NotFound desc = could not find container \"d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a\": container with ID starting with d503c248960261ddd53aa426ca3b99c242e8914a5b6e48908e2c8b9bfaca780a not found: ID does not exist" Apr 22 19:17:07.241184 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.239559 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:17:07.247159 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:07.247130 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-ffb29-predictor-b8d4948fb-5wxhd"] Apr 22 19:17:08.204654 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:08.204614 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="72e8ea5f-a383-4292-80ae-d6039c848008" path="/var/lib/kubelet/pods/72e8ea5f-a383-4292-80ae-d6039c848008/volumes" Apr 22 19:17:11.216053 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:11.216024 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:17:11.216637 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:11.216572 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:16.111240 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:16.111200 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:17:21.216721 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:21.216679 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:26.112687 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:26.112654 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:17:31.216513 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:31.216470 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:41.217462 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:41.217418 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:47.828443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.828404 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:17:47.828959 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.828706 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" containerID="cri-o://4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3" gracePeriod=30 Apr 22 19:17:47.828959 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.828756 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kube-rbac-proxy" containerID="cri-o://e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb" gracePeriod=30 Apr 22 19:17:47.850587 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.850553 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:17:47.850915 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.850886 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kube-rbac-proxy" Apr 22 19:17:47.851000 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.850921 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kube-rbac-proxy" Apr 22 19:17:47.851000 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.850956 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" Apr 22 19:17:47.851000 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.850966 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" Apr 22 19:17:47.851152 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.851026 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kserve-container" Apr 22 19:17:47.851152 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.851040 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="72e8ea5f-a383-4292-80ae-d6039c848008" containerName="kube-rbac-proxy" Apr 22 19:17:47.853950 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.853926 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:47.855864 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.855841 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-352f4-predictor-serving-cert\"" Apr 22 19:17:47.856013 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.855862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-352f4-kube-rbac-proxy-sar-config\"" Apr 22 19:17:47.861704 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.861673 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:17:47.906547 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.906506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:47.906745 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.906574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:47.906745 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:47.906603 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s7jhw\" (UniqueName: \"kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.007921 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.007860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.007921 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.007924 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.008175 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.007957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7jhw\" (UniqueName: \"kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.008175 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:48.008025 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-352f4-predictor-serving-cert: secret "success-200-isvc-352f4-predictor-serving-cert" 
not found Apr 22 19:17:48.008175 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:48.008109 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls podName:815909cd-48dc-47d0-80b7-aa2812993f0a nodeName:}" failed. No retries permitted until 2026-04-22 19:17:48.508088313 +0000 UTC m=+1886.846074143 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls") pod "success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" (UID: "815909cd-48dc-47d0-80b7-aa2812993f0a") : secret "success-200-isvc-352f4-predictor-serving-cert" not found Apr 22 19:17:48.008616 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.008595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.018129 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.018098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7jhw\" (UniqueName: \"kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.333660 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.333623 2572 generic.go:358] "Generic (PLEG): container finished" podID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerID="e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb" exitCode=2 Apr 22 19:17:48.333859 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.333696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerDied","Data":"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb"} Apr 22 19:17:48.511652 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.511611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.514073 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.514051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") pod \"success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.765796 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.765703 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:48.887958 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:48.887924 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:17:48.890166 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:17:48.890134 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815909cd_48dc_47d0_80b7_aa2812993f0a.slice/crio-cec59bcde55045641b1bc6704c0357ea198b10b4502421cae07f2569119e06ae WatchSource:0}: Error finding container cec59bcde55045641b1bc6704c0357ea198b10b4502421cae07f2569119e06ae: Status 404 returned error can't find the container with id cec59bcde55045641b1bc6704c0357ea198b10b4502421cae07f2569119e06ae Apr 22 19:17:49.341635 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:49.341596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerStarted","Data":"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029"} Apr 22 19:17:49.341635 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:49.341639 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerStarted","Data":"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814"} Apr 22 19:17:49.341635 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:49.341651 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerStarted","Data":"cec59bcde55045641b1bc6704c0357ea198b10b4502421cae07f2569119e06ae"} Apr 22 19:17:49.341978 ip-10-0-138-84 
kubenswrapper[2572]: I0422 19:17:49.341751 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:49.357920 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:49.357846 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podStartSLOduration=2.3578261879999998 podStartE2EDuration="2.357826188s" podCreationTimestamp="2026-04-22 19:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:17:49.356123272 +0000 UTC m=+1887.694109112" watchObservedRunningTime="2026-04-22 19:17:49.357826188 +0000 UTC m=+1887.695812028" Apr 22 19:17:50.344616 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:50.344575 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:50.345807 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:50.345781 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:17:51.107081 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.107042 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused" Apr 22 19:17:51.216537 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.216491 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:17:51.352427 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.352385 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:17:51.572068 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.572038 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:17:51.638207 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.638166 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config\") pod \"577c7ee9-bb8b-48ec-854a-02e79872459b\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " Apr 22 19:17:51.638409 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.638233 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") pod \"577c7ee9-bb8b-48ec-854a-02e79872459b\" (UID: \"577c7ee9-bb8b-48ec-854a-02e79872459b\") " Apr 22 19:17:51.638409 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.638268 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztc9h\" (UniqueName: \"kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h\") pod \"577c7ee9-bb8b-48ec-854a-02e79872459b\" (UID: 
\"577c7ee9-bb8b-48ec-854a-02e79872459b\") " Apr 22 19:17:51.638667 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.638635 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-c8908-kube-rbac-proxy-sar-config") pod "577c7ee9-bb8b-48ec-854a-02e79872459b" (UID: "577c7ee9-bb8b-48ec-854a-02e79872459b"). InnerVolumeSpecName "success-200-isvc-c8908-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:17:51.640407 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.640374 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "577c7ee9-bb8b-48ec-854a-02e79872459b" (UID: "577c7ee9-bb8b-48ec-854a-02e79872459b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:17:51.640518 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.640439 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h" (OuterVolumeSpecName: "kube-api-access-ztc9h") pod "577c7ee9-bb8b-48ec-854a-02e79872459b" (UID: "577c7ee9-bb8b-48ec-854a-02e79872459b"). InnerVolumeSpecName "kube-api-access-ztc9h". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:17:51.739033 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.738927 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-c8908-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/577c7ee9-bb8b-48ec-854a-02e79872459b-success-200-isvc-c8908-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:51.739033 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.738960 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/577c7ee9-bb8b-48ec-854a-02e79872459b-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:51.739033 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:51.738972 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ztc9h\" (UniqueName: \"kubernetes.io/projected/577c7ee9-bb8b-48ec-854a-02e79872459b-kube-api-access-ztc9h\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:17:52.355893 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.355854 2572 generic.go:358] "Generic (PLEG): container finished" podID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerID="4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3" exitCode=0 Apr 22 19:17:52.356325 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.355932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerDied","Data":"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3"} Apr 22 19:17:52.356325 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.355977 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" 
event={"ID":"577c7ee9-bb8b-48ec-854a-02e79872459b","Type":"ContainerDied","Data":"04d087aaebca6acad6ad85029930b1982291696c6d818959ea49293ae4e14f4a"} Apr 22 19:17:52.356325 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.355992 2572 scope.go:117] "RemoveContainer" containerID="e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb" Apr 22 19:17:52.356325 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.355948 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb" Apr 22 19:17:52.363763 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.363741 2572 scope.go:117] "RemoveContainer" containerID="4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3" Apr 22 19:17:52.371286 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.371252 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:17:52.371572 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.371548 2572 scope.go:117] "RemoveContainer" containerID="e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb" Apr 22 19:17:52.371920 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:52.371883 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb\": container with ID starting with e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb not found: ID does not exist" containerID="e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb" Apr 22 19:17:52.372022 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.371934 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb"} err="failed to get container status 
\"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb\": rpc error: code = NotFound desc = could not find container \"e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb\": container with ID starting with e57b1efac458b31399e96a6f5d93d0495184a296d4da3203616503628cf36feb not found: ID does not exist" Apr 22 19:17:52.372022 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.371970 2572 scope.go:117] "RemoveContainer" containerID="4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3" Apr 22 19:17:52.372288 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:17:52.372269 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3\": container with ID starting with 4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3 not found: ID does not exist" containerID="4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3" Apr 22 19:17:52.372339 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.372294 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3"} err="failed to get container status \"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3\": rpc error: code = NotFound desc = could not find container \"4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3\": container with ID starting with 4e8b3185950a654f482e10757489bc39d1f4f2bec0e25934802df31c2a201ea3 not found: ID does not exist" Apr 22 19:17:52.372531 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:52.372513 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-c8908-predictor-b7c844446-s6swb"] Apr 22 19:17:54.205294 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:54.205255 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" path="/var/lib/kubelet/pods/577c7ee9-bb8b-48ec-854a-02e79872459b/volumes" Apr 22 19:17:56.357037 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:56.357003 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:17:56.357508 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:17:56.357481 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:18:01.217049 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:18:01.217020 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:18:06.358005 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:18:06.357964 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:18:16.358236 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:18:16.358195 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:18:26.357458 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:18:26.357416 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 22 19:18:36.359027 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:18:36.358995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:27:02.733303 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:02.733211 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:27:02.735694 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:02.733593 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" containerID="cri-o://9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814" gracePeriod=30 Apr 22 19:27:02.735694 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:02.733626 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kube-rbac-proxy" containerID="cri-o://5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029" gracePeriod=30 Apr 22 19:27:02.869553 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:02.869519 2572 generic.go:358] "Generic (PLEG): container finished" podID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerID="5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029" exitCode=2 Apr 22 19:27:02.869721 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:02.869592 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" 
event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerDied","Data":"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029"} Apr 22 19:27:05.773240 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.773216 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:27:05.849651 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.849560 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7jhw\" (UniqueName: \"kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw\") pod \"815909cd-48dc-47d0-80b7-aa2812993f0a\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " Apr 22 19:27:05.849651 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.849607 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") pod \"815909cd-48dc-47d0-80b7-aa2812993f0a\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " Apr 22 19:27:05.849849 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.849658 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config\") pod \"815909cd-48dc-47d0-80b7-aa2812993f0a\" (UID: \"815909cd-48dc-47d0-80b7-aa2812993f0a\") " Apr 22 19:27:05.850078 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.850053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-352f4-kube-rbac-proxy-sar-config") pod "815909cd-48dc-47d0-80b7-aa2812993f0a" (UID: 
"815909cd-48dc-47d0-80b7-aa2812993f0a"). InnerVolumeSpecName "success-200-isvc-352f4-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:05.851731 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.851709 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "815909cd-48dc-47d0-80b7-aa2812993f0a" (UID: "815909cd-48dc-47d0-80b7-aa2812993f0a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:05.851786 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.851722 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw" (OuterVolumeSpecName: "kube-api-access-s7jhw") pod "815909cd-48dc-47d0-80b7-aa2812993f0a" (UID: "815909cd-48dc-47d0-80b7-aa2812993f0a"). InnerVolumeSpecName "kube-api-access-s7jhw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:05.879430 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.879395 2572 generic.go:358] "Generic (PLEG): container finished" podID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerID="9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814" exitCode=0 Apr 22 19:27:05.879580 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.879464 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" Apr 22 19:27:05.879580 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.879478 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerDied","Data":"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814"} Apr 22 19:27:05.879580 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.879515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx" event={"ID":"815909cd-48dc-47d0-80b7-aa2812993f0a","Type":"ContainerDied","Data":"cec59bcde55045641b1bc6704c0357ea198b10b4502421cae07f2569119e06ae"} Apr 22 19:27:05.879580 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.879530 2572 scope.go:117] "RemoveContainer" containerID="5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029" Apr 22 19:27:05.887671 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.887651 2572 scope.go:117] "RemoveContainer" containerID="9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814" Apr 22 19:27:05.894403 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.894385 2572 scope.go:117] "RemoveContainer" containerID="5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029" Apr 22 19:27:05.894659 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:27:05.894631 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029\": container with ID starting with 5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029 not found: ID does not exist" containerID="5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029" Apr 22 19:27:05.894772 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.894669 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029"} err="failed to get container status \"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029\": rpc error: code = NotFound desc = could not find container \"5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029\": container with ID starting with 5ce78b25d711e9fe8ddcd5a14abb2cba08d6a814d72d931e52463d7967052029 not found: ID does not exist" Apr 22 19:27:05.894772 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.894693 2572 scope.go:117] "RemoveContainer" containerID="9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814" Apr 22 19:27:05.894954 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:27:05.894936 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814\": container with ID starting with 9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814 not found: ID does not exist" containerID="9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814" Apr 22 19:27:05.895023 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.894962 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814"} err="failed to get container status \"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814\": rpc error: code = NotFound desc = could not find container \"9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814\": container with ID starting with 9d44f4a5eb0627644e1bf7142627f30107ce865d904740316f7fa5ec9da35814 not found: ID does not exist" Apr 22 19:27:05.898951 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.898929 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:27:05.902326 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.902305 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-352f4-predictor-5f6bdb8c66-j6kcx"] Apr 22 19:27:05.950602 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.950564 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7jhw\" (UniqueName: \"kubernetes.io/projected/815909cd-48dc-47d0-80b7-aa2812993f0a-kube-api-access-s7jhw\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:27:05.950602 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.950600 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/815909cd-48dc-47d0-80b7-aa2812993f0a-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:27:05.950793 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:05.950612 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-352f4-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/815909cd-48dc-47d0-80b7-aa2812993f0a-success-200-isvc-352f4-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:27:06.205211 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:27:06.205131 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" path="/var/lib/kubelet/pods/815909cd-48dc-47d0-80b7-aa2812993f0a/volumes" Apr 22 19:34:33.089847 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:33.089814 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:34:33.090330 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:33.090133 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" containerID="cri-o://eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70" gracePeriod=30 Apr 22 19:34:33.090330 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:33.090168 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kube-rbac-proxy" containerID="cri-o://024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058" gracePeriod=30 Apr 22 19:34:34.051872 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:34.051840 2572 generic.go:358] "Generic (PLEG): container finished" podID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerID="024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058" exitCode=2 Apr 22 19:34:34.052065 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:34.051937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerDied","Data":"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058"} Apr 22 19:34:35.930227 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:35.930205 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:34:36.058093 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.058058 2572 generic.go:358] "Generic (PLEG): container finished" podID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerID="eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70" exitCode=0 Apr 22 19:34:36.058265 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.058129 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" Apr 22 19:34:36.058265 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.058123 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerDied","Data":"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70"} Apr 22 19:34:36.058265 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.058234 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp" event={"ID":"fed8332d-991a-446a-a7f2-afb45300ed5f","Type":"ContainerDied","Data":"17a4de3a961e89273d28149bdfb74136382310eb0152d7950e5c10ca11611b01"} Apr 22 19:34:36.058265 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.058250 2572 scope.go:117] "RemoveContainer" containerID="024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058" Apr 22 19:34:36.065452 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.065436 2572 scope.go:117] "RemoveContainer" containerID="eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70" Apr 22 19:34:36.071870 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.071847 2572 scope.go:117] "RemoveContainer" containerID="024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058" Apr 22 19:34:36.072136 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:34:36.072118 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058\": container with ID starting with 024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058 not found: ID does not exist" containerID="024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058" Apr 22 19:34:36.072182 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.072145 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058"} err="failed to get container status \"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058\": rpc error: code = NotFound desc = could not find container \"024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058\": container with ID starting with 024eb5d2afa0b983081531a5464096892315b444e2e8a3a6ac8621f4948ab058 not found: ID does not exist" Apr 22 19:34:36.072182 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.072162 2572 scope.go:117] "RemoveContainer" containerID="eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70" Apr 22 19:34:36.072366 ip-10-0-138-84 kubenswrapper[2572]: E0422 19:34:36.072350 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70\": container with ID starting with eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70 not found: ID does not exist" containerID="eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70" Apr 22 19:34:36.072405 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.072372 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70"} err="failed to get container status \"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70\": rpc error: code = NotFound desc = could not find container \"eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70\": container with ID starting with eb04c0bff903eb126a34445beea75000e10fd73ba09dd5eef47f9acbd7595c70 not found: ID does not exist" Apr 22 19:34:36.109784 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.109761 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nbc95\" (UniqueName: \"kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95\") pod \"fed8332d-991a-446a-a7f2-afb45300ed5f\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " Apr 22 19:34:36.109878 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.109800 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config\") pod \"fed8332d-991a-446a-a7f2-afb45300ed5f\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " Apr 22 19:34:36.109948 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.109886 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls\") pod \"fed8332d-991a-446a-a7f2-afb45300ed5f\" (UID: \"fed8332d-991a-446a-a7f2-afb45300ed5f\") " Apr 22 19:34:36.110225 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.110197 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e7803-kube-rbac-proxy-sar-config") pod "fed8332d-991a-446a-a7f2-afb45300ed5f" (UID: "fed8332d-991a-446a-a7f2-afb45300ed5f"). InnerVolumeSpecName "success-200-isvc-e7803-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:34:36.111779 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.111748 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95" (OuterVolumeSpecName: "kube-api-access-nbc95") pod "fed8332d-991a-446a-a7f2-afb45300ed5f" (UID: "fed8332d-991a-446a-a7f2-afb45300ed5f"). 
InnerVolumeSpecName "kube-api-access-nbc95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:34:36.111856 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.111759 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fed8332d-991a-446a-a7f2-afb45300ed5f" (UID: "fed8332d-991a-446a-a7f2-afb45300ed5f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:34:36.211137 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.211105 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fed8332d-991a-446a-a7f2-afb45300ed5f-proxy-tls\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.211137 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.211137 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbc95\" (UniqueName: \"kubernetes.io/projected/fed8332d-991a-446a-a7f2-afb45300ed5f-kube-api-access-nbc95\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.211312 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.211152 2572 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e7803-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fed8332d-991a-446a-a7f2-afb45300ed5f-success-200-isvc-e7803-kube-rbac-proxy-sar-config\") on node \"ip-10-0-138-84.ec2.internal\" DevicePath \"\"" Apr 22 19:34:36.373048 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.372980 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:34:36.378082 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:36.378051 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e7803-predictor-64dbd4f98d-vc5wp"] Apr 22 19:34:38.204460 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:38.204422 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" path="/var/lib/kubelet/pods/fed8332d-991a-446a-a7f2-afb45300ed5f/volumes" Apr 22 19:34:59.789729 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789687 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6dp56/must-gather-mkwz7"] Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789944 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789966 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789975 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789981 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789988 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.789994 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790002 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" 
containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790007 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790022 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790027 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790033 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790038 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790078 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790086 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790094 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="815909cd-48dc-47d0-80b7-aa2812993f0a" containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790100 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" 
containerName="kserve-container" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790107 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="577c7ee9-bb8b-48ec-854a-02e79872459b" containerName="kube-rbac-proxy" Apr 22 19:34:59.790278 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.790113 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fed8332d-991a-446a-a7f2-afb45300ed5f" containerName="kserve-container" Apr 22 19:34:59.792856 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.792838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.794886 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.794862 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"openshift-service-ca.crt\"" Apr 22 19:34:59.795434 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.795409 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"kube-root-ca.crt\"" Apr 22 19:34:59.795434 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.795423 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6dp56\"/\"default-dockercfg-dcnhf\"" Apr 22 19:34:59.798754 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.798730 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/must-gather-mkwz7"] Apr 22 19:34:59.887810 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.887773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjdw\" (UniqueName: \"kubernetes.io/projected/3ea7206e-3f64-47ab-860b-e2bf870a421b-kube-api-access-wgjdw\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.887810 
ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.887811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ea7206e-3f64-47ab-860b-e2bf870a421b-must-gather-output\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.988277 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.988233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjdw\" (UniqueName: \"kubernetes.io/projected/3ea7206e-3f64-47ab-860b-e2bf870a421b-kube-api-access-wgjdw\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.988277 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.988276 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ea7206e-3f64-47ab-860b-e2bf870a421b-must-gather-output\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.988584 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.988563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3ea7206e-3f64-47ab-860b-e2bf870a421b-must-gather-output\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:34:59.995461 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:34:59.995439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjdw\" (UniqueName: \"kubernetes.io/projected/3ea7206e-3f64-47ab-860b-e2bf870a421b-kube-api-access-wgjdw\") pod \"must-gather-mkwz7\" (UID: \"3ea7206e-3f64-47ab-860b-e2bf870a421b\") " 
pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:35:00.103090 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:00.102980 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/must-gather-mkwz7" Apr 22 19:35:00.217430 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:00.217399 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/must-gather-mkwz7"] Apr 22 19:35:00.221404 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:35:00.221374 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ea7206e_3f64_47ab_860b_e2bf870a421b.slice/crio-8bbb4305d4b20f94e9333483a19c8519390e584dc9d99af5257d7374a5295d3e WatchSource:0}: Error finding container 8bbb4305d4b20f94e9333483a19c8519390e584dc9d99af5257d7374a5295d3e: Status 404 returned error can't find the container with id 8bbb4305d4b20f94e9333483a19c8519390e584dc9d99af5257d7374a5295d3e Apr 22 19:35:00.223027 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:00.223011 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:35:01.130056 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:01.130018 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/must-gather-mkwz7" event={"ID":"3ea7206e-3f64-47ab-860b-e2bf870a421b","Type":"ContainerStarted","Data":"8bbb4305d4b20f94e9333483a19c8519390e584dc9d99af5257d7374a5295d3e"} Apr 22 19:35:02.136307 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.136268 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/must-gather-mkwz7" event={"ID":"3ea7206e-3f64-47ab-860b-e2bf870a421b","Type":"ContainerStarted","Data":"9b910e50afa143fe8425aa63eb712a5ea5bb9fb78e3f236651271903cc29fdfd"} Apr 22 19:35:02.136855 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.136826 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-must-gather-6dp56/must-gather-mkwz7" event={"ID":"3ea7206e-3f64-47ab-860b-e2bf870a421b","Type":"ContainerStarted","Data":"038fcbd30a0d94895e8dfec365c10529f378b6d27f4de1392384437df0fc698b"}
Apr 22 19:35:02.151602 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.151545 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6dp56/must-gather-mkwz7" podStartSLOduration=2.244999973 podStartE2EDuration="3.151527338s" podCreationTimestamp="2026-04-22 19:34:59 +0000 UTC" firstStartedPulling="2026-04-22 19:35:00.22317214 +0000 UTC m=+2918.561157960" lastFinishedPulling="2026-04-22 19:35:01.129699505 +0000 UTC m=+2919.467685325" observedRunningTime="2026-04-22 19:35:02.15002407 +0000 UTC m=+2920.488009910" watchObservedRunningTime="2026-04-22 19:35:02.151527338 +0000 UTC m=+2920.489513180"
Apr 22 19:35:02.572677 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.572644 2572 ???:1] "http: TLS handshake error from 10.0.138.84:41524: EOF"
Apr 22 19:35:02.581666 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.581612 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f4bd6_a4ef7ff9-4ca4-496b-a2f3-fc3bd1ed7c4d/global-pull-secret-syncer/0.log"
Apr 22 19:35:02.662059 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.662021 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2gt98_9cc1681f-a720-40b5-a0fa-1d414f3f4906/konnectivity-agent/0.log"
Apr 22 19:35:02.781609 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:02.781574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-84.ec2.internal_44e2894ddf571d488225d543b36d7bb8/haproxy/0.log"
Apr 22 19:35:06.249574 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:06.249534 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gzln9_8e048393-e835-467d-8f0d-48a2d41d7bcd/node-exporter/0.log"
Apr 22 19:35:06.271846 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:06.271817 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gzln9_8e048393-e835-467d-8f0d-48a2d41d7bcd/kube-rbac-proxy/0.log"
Apr 22 19:35:06.292236 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:06.292209 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gzln9_8e048393-e835-467d-8f0d-48a2d41d7bcd/init-textfile/0.log"
Apr 22 19:35:09.917017 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.916977 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"]
Apr 22 19:35:09.921845 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.921790 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.929998 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.929676 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"]
Apr 22 19:35:09.970154 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.970101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-sys\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.970349 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.970226 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-lib-modules\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.970349 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.970253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-podres\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.970477 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.970372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7gp\" (UniqueName: \"kubernetes.io/projected/183305c4-8329-4680-9803-57d0c503c7be-kube-api-access-zv7gp\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.970477 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.970400 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-proc\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:09.971860 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.971793 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k779l_9151f183-6814-4a15-b1a7-9bd9ce7b5c59/dns/0.log"
Apr 22 19:35:09.989588 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:09.989557 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k779l_9151f183-6814-4a15-b1a7-9bd9ce7b5c59/kube-rbac-proxy/0.log"
Apr 22 19:35:10.054195 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.054168 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q65sk_86c95b36-8aa3-4c99-b0b6-3746cf836c8c/dns-node-resolver/0.log"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-sys\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071109 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-lib-modules\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071138 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-podres\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7gp\" (UniqueName: \"kubernetes.io/projected/183305c4-8329-4680-9803-57d0c503c7be-kube-api-access-zv7gp\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-proc\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-proc\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071443 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071370 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-sys\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071782 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-lib-modules\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.071782 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.071541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/183305c4-8329-4680-9803-57d0c503c7be-podres\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.078946 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.078875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv7gp\" (UniqueName: \"kubernetes.io/projected/183305c4-8329-4680-9803-57d0c503c7be-kube-api-access-zv7gp\") pod \"perf-node-gather-daemonset-l66ns\" (UID: \"183305c4-8329-4680-9803-57d0c503c7be\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.234343 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.234256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:10.377034 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.377002 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"]
Apr 22 19:35:10.381706 ip-10-0-138-84 kubenswrapper[2572]: W0422 19:35:10.380612 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod183305c4_8329_4680_9803_57d0c503c7be.slice/crio-83462de180e0d57c5ba2d405337352e13356aa1f84311adb41c52c6b7855cc0b WatchSource:0}: Error finding container 83462de180e0d57c5ba2d405337352e13356aa1f84311adb41c52c6b7855cc0b: Status 404 returned error can't find the container with id 83462de180e0d57c5ba2d405337352e13356aa1f84311adb41c52c6b7855cc0b
Apr 22 19:35:10.537545 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:10.537516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x6px5_53d298d6-4725-419c-b9f4-0f58a63b1715/node-ca/0.log"
Apr 22 19:35:11.170618 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.170583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns" event={"ID":"183305c4-8329-4680-9803-57d0c503c7be","Type":"ContainerStarted","Data":"a3d6d895b99918a8b271e9887cec92fc7a4e427c5768017d86d85cdd1057d1d6"}
Apr 22 19:35:11.171113 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.170628 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns" event={"ID":"183305c4-8329-4680-9803-57d0c503c7be","Type":"ContainerStarted","Data":"83462de180e0d57c5ba2d405337352e13356aa1f84311adb41c52c6b7855cc0b"}
Apr 22 19:35:11.171113 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.170729 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:11.186368 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.185842 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns" podStartSLOduration=2.185824781 podStartE2EDuration="2.185824781s" podCreationTimestamp="2026-04-22 19:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:35:11.184055428 +0000 UTC m=+2929.522041282" watchObservedRunningTime="2026-04-22 19:35:11.185824781 +0000 UTC m=+2929.523810623"
Apr 22 19:35:11.523102 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.523077 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9mmmk_1ddee21f-46d4-45d6-bdfe-9fcc6baf236b/serve-healthcheck-canary/0.log"
Apr 22 19:35:11.907614 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.907540 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5mzdz_5453deb9-8135-4a14-8b07-385e16aad1aa/kube-rbac-proxy/0.log"
Apr 22 19:35:11.924283 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.924256 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5mzdz_5453deb9-8135-4a14-8b07-385e16aad1aa/exporter/0.log"
Apr 22 19:35:11.942884 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:11.942856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5mzdz_5453deb9-8135-4a14-8b07-385e16aad1aa/extractor/0.log"
Apr 22 19:35:13.993613 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:13.993584 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-jthj6_059b92a9-04fa-4655-885e-b791d19ead5b/server/0.log"
Apr 22 19:35:17.183538 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:17.183508 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-l66ns"
Apr 22 19:35:18.954434 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:18.954355 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:35:19.008103 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.008070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/egress-router-binary-copy/0.log"
Apr 22 19:35:19.026264 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.026237 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/cni-plugins/0.log"
Apr 22 19:35:19.044558 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.044515 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/bond-cni-plugin/0.log"
Apr 22 19:35:19.069128 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.069098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/routeoverride-cni/0.log"
Apr 22 19:35:19.087169 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.087138 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/whereabouts-cni-bincopy/0.log"
Apr 22 19:35:19.104466 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.104433 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fsns5_bcd75ef0-f7af-4a32-b19b-aa29b44cd391/whereabouts-cni/0.log"
Apr 22 19:35:19.474547 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.474516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nnfk2_26feee8a-9b45-4708-a356-fcabada1a28c/kube-multus/0.log"
Apr 22 19:35:19.571446 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.571406 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8xjpc_e28dd910-549e-488c-8e99-3ad3f1d11a5e/network-metrics-daemon/0.log"
Apr 22 19:35:19.587985 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:19.587941 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8xjpc_e28dd910-549e-488c-8e99-3ad3f1d11a5e/kube-rbac-proxy/0.log"
Apr 22 19:35:20.918457 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:20.918420 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/ovn-controller/0.log"
Apr 22 19:35:20.963958 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:20.963920 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/ovn-acl-logging/0.log"
Apr 22 19:35:21.011538 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.011507 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/kube-rbac-proxy-node/0.log"
Apr 22 19:35:21.032261 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.032185 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:35:21.046265 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.046237 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/northd/0.log"
Apr 22 19:35:21.062750 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.062713 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/nbdb/0.log"
Apr 22 19:35:21.079099 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.079073 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/sbdb/0.log"
Apr 22 19:35:21.258544 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:21.258508 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vtpk8_1c76f00a-74ae-463c-9b29-4b39f9d6a26d/ovnkube-controller/0.log"
Apr 22 19:35:22.255982 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:22.255938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-h9t7j_64bb8453-8f86-4f40-ab06-b6f7eb42265e/network-check-target-container/0.log"
Apr 22 19:35:23.098666 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:23.098636 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lvpr2_28f9f4d1-15dc-40be-a8db-e7a35cb819c1/iptables-alerter/0.log"
Apr 22 19:35:23.773935 ip-10-0-138-84 kubenswrapper[2572]: I0422 19:35:23.773891 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-9q6ht_07979851-c8ab-4500-998e-e7498964b0a7/tuned/0.log"